# Compare commits: master...1.6.x-stab

1 commit. 77 changed files with 5,334 additions and 4,695 deletions.

| Author | SHA1 | Date |
|---|---|---|
|  | b2f83fea76 |  |
## .gitignore (vendored, 8 changes)

```
@@ -1,4 +1,4 @@
pom.xml*
pom.xml
*jar
/lib/
/classes/
@@ -8,9 +8,3 @@ checkouts/*
doc/*
deploy.docs.sh
target/*
todo.org
.nrepl-*
.idea/
*.iml
/.clj-kondo/.cache
/.lsp/.cache
```
## .travis.yml (19 changes)

```
@@ -1,20 +1,11 @@
language: clojure
sudo: required
lein: lein
dist: xenial
lein: lein2
before_script:
  # Give MongoDB server some time to boot
  - sleep 15
  - mongod --version
  - ./bin/ci/before_script.sh
script: lein do clean, javac, test
script: lein2 all do clean, javac, test
jdk:
  - openjdk10
  - oraclejdk11
  - openjdk12
  - openjdk6
  - openjdk7
  - oraclejdk7
services:
  - mongodb
branches:
  only:
    - master
    - 3.5.x-stable
```
## (file name not shown)

@@ -1,11 +1,10 @@

## Pre-requisites

The project uses [Leiningen 2](http://leiningen.org) and requires a recent MongoDB to be running
locally. Make sure you have those two installed and then run tests against all supported Clojure versions using

    ./bin/ci/before_script.sh
    lein all do clean, javac, test

The project uses [Leiningen 2](https://leiningen.org) and requires MongoDB `2.4+` to be running
locally. Make sure you have those two installed and then run tests against all supported Clojure versions using

    lein2 all test

## Pull Requests
## ChangeLog.md (353 changes)

@@ -1,356 +1,3 @@

## Changes between 3.5.x and 3.6.0 (unreleased)

### UUID Representation Option

Added a new connection option, `:uuid-representation`.

Contributed by @okorz001.

GitHub issue: [#212](https://github.com/michaelklishin/monger/issues/212)
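A sketch of how the option might be supplied. This is hedged: the exact plumbing and accepted values (e.g. `:standard`) are assumptions based on the MongoDB Java driver's `UuidRepresentation`; `mg/mongo-options` and `mg/server-address` are existing `monger.core` helpers:

``` clojure
(require '[monger.core :as mg])

;; Assumption: :uuid-representation is read from the options map and
;; mapped onto the Java driver's UuidRepresentation (e.g. :standard).
(let [opts (mg/mongo-options {:uuid-representation :standard})
      sa   (mg/server-address "127.0.0.1" 27017)
      conn (mg/connect sa opts)]
  )
```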
### Operator List Update

For MongoDB 4.x.

Contributed by @mjrb.

GitHub issue: [#196](https://github.com/michaelklishin/monger/pull/196)

### Dependency Update

Contributed by @robhanlon22.

GitHub issue: [#206](https://github.com/michaelklishin/monger/pull/206)


## Changes between 3.1.x and 3.5.0 (Dec 10th, 2018)

### MongoDB Java Driver Update

MongoDB Java driver dependency has been updated to `3.9.x`.

This means that Monger now **requires JDK 8**.

Contributed by @Linicks.

### 3rd Party Library Compatibility

* Cheshire `5.8.x`
* clj-time `0.15.1`
* ring-core `0.15.1`
* Ragtime `0.7.x`.

### URI Connection Usability Improvement

URIs that don't specify a database will now be rejected as invalid.

Contributed by Chris Broome.


## Changes between 3.0.x and 3.1.0 (September 17th, 2016)

### MongoDB Java Driver Update

MongoDB Java driver dependency has been updated to `3.3.0`.

### Cursor Hinting Option Fix

Contributed by Stijn Opheide.

### Improved DBObject to Clojure Map Conversion Performance

The new `from-db-object` implementation for `DBObject` avoids creating an unnecessary
sequence and instead directly accesses the `DBObject` instance in a reduce. This should
offer a performance improvement of about 20%. A performance test can be found
at [monger.test.stress-test](https://github.com/michaelklishin/monger/blob/master/test/monger/test/stress_test.clj).

Contributed by Juho Teperi.
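The conversion function lives in `monger.conversion`; a small sketch of calling it directly (the sample `DBObject` is constructed by hand for illustration):

``` clojure
(require '[monger.conversion :as cnv])
(import com.mongodb.BasicDBObject)

;; Build a DBObject by hand, then convert it to a Clojure map.
;; The second argument controls whether keys are keywordized.
(let [dbo (doto (BasicDBObject.)
            (.put "name" "Monger")
            (.put "stars" 800))]
  (cnv/from-db-object dbo true))
;; should yield {:name "Monger", :stars 800}
```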
### Authentication Function No Longer Ignores Credentials

In some cases Monger ignored provided credentials.

Contributed by Artem Chistyakov.

### Macro Type Hint Fixes

Contributed by Andre Ambrosio Boechat.


## Changes between 2.1.0 and 3.0.0

Monger 3.0 is based on the [MongoDB Java driver 3.0](https://www.mongodb.com/blog/post/introducing-30-java-driver)
and has some (relatively minor) **breaking API changes**.

### Error Handling Built Around Write Concerns

Monger no longer provides `monger.core/get-last-error`. It is no
longer needed: write concerns and exceptions are now the primary way for clients
to be notified of operation failures.

### New Authentication API

MongoDB 3.0 supports different authentication mechanisms. Multiple
credentials can be specified for a single connection. The client
and the server then negotiate which authentication mechanism to use
and which set of credentials succeeds.

Monger introduces a new namespace for credential instantiation:
`monger.credentials`. The most common function that relies on
authentication mechanism negotiation is `monger.credentials/for`:

``` clojure
(require '[monger.core :as mg])
(require '[monger.credentials :as mcr])

(let [creds (mcr/for "username" "db-name" "pa$$w0rd")
      conn  (mg/connect-with-credentials "127.0.0.1" creds)]
  )
```

`mg/connect-with-credentials` is the most convenient function to
connect with if you plan on using authentication.

When connecting using a URI, the API hasn't changed.

### monger.search is Gone

`monger.search` is gone. MongoDB 3.0 supports search queries
using regular query operators, namely `$text`. `monger.operators` is
extended to include `$text`, `$search`, `$language`, and `$natural`.

An example of a search query in 3.0:

``` clojure
(require '[monger.core :as mg])
(require '[monger.credentials :as mcr])
(require '[monger.collection :as mc])
(require '[monger.operators :refer [$text $search]])

(let [creds (mcr/for "username" "db-name" "pa$$w0rd")
      conn  (mg/connect-with-credentials "127.0.0.1" creds)
      db    (mg/get-db conn "db-name")]
  (mc/find-maps db "collection" {$text {$search "hello"}}))
```

### Add allow-disk-use and Cursor Options to Aggregates

`monger.collection/aggregate` now supports `:cursor` and `:allow-disk-use` options.

Contributed by Bartek Marcinowski.


### JSON Serialization of BSON Timestamps

JSON serialisation extensions now support BSON timestamps.

Contributed by Tom McMillen.


## Changes between 2.0.0 and 2.1.0

### Clojure 1.7 Compatibility

Monger now compiles with Clojure 1.7.

### MongoDB Java Driver Update

MongoDB Java driver dependency has been updated to `2.13.x`.

### $each Operator

The `$each` operator can now be used via `monger.operators`.

Contributed by Juha Jokimäki.
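A brief sketch of `$each` combined with `$addToSet` (collection, field, and id are made up for illustration):

``` clojure
(require '[monger.core :as mg]
         '[monger.collection :as mc]
         '[monger.operators :refer [$addToSet $each]])

;; Push several values into an array field in a single update.
(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")]
  (mc/update db "docs" {:_id 1}
             {$addToSet {:tags {$each ["clojure" "mongodb"]}}}))
```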
## Changes between 1.8.0 and 2.0.0

`2.0` is a major release that has **breaking public API changes**.

### Explicit Connection/DB/GridFS Argument

In Monger 2.0, all key public API functions require an explicit
DB/connection/GridFS object to be provided instead of relying on
a shared dynamic var. This makes Monger much easier to use with
systems such as Component and Jig, as well as concurrent
applications that need to work with multiple connections, databases,
or GridFS filesystems.

In other words, instead of

``` clojure
(require '[monger.collection :as mc])

(mc/insert "libraries" {:name "Monger"})
```

it is now necessary to do

``` clojure
(require '[monger.collection :as mc])

(mc/insert db "libraries" {:name "Monger"})
```

This also means that `monger.core/connect!` and
`monger.core/connect-via-uri!` were removed, as were the
`monger.multi` namespaces.

To connect to MongoDB, use `monger.core/connect`:

``` clojure
(require '[monger.core :as mg])

(let [conn (mg/connect)])
```

or `monger.core/connect-via-uri`:

``` clojure
(require '[monger.core :as mg])

(let [{:keys [conn db]} (mg/connect-via-uri "mongodb://clojurewerkz/monger:monger@127.0.0.1/monger-test4")])
```

To get a database reference, use `monger.core/get-db`, which now requires a connection
object:

``` clojure
(require '[monger.core :as mg])

(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")])
```

### Options as Maps

Functions that take options now require a proper Clojure map instead of
pseudo keyword arguments:

``` clojure
;; in Monger 1.x
(mc/update db coll {} {:score 0} :multi true)

;; in Monger 2.x
(mc/update db coll {} {:score 0} {:multi true})
```


## Changes between 1.8.0-beta2 and 1.8.0

### Clojure 1.6

Monger now depends on `org.clojure/clojure` version `1.6.0`. It is
still compatible with Clojure 1.4, and if your `project.clj` depends on
a different version, that one will be used, but 1.6 is the default now.

## Changes between 1.8.0-beta1 and 1.8.0-beta2

### monger.result Use with WriteConcerns is Deprecated

MongoDB Java driver 2.12.x [no longer guarantees connection affinity](https://github.com/mongodb/mongo-java-driver/releases/tag/r2.12.0-rc0) for thread pool
threads.

This means that `WriteConcern#getLastError` is no longer safe from concurrency
hazards. Therefore the use of `monger.result` functions on `WriteConcern` instances
is now **deprecated** in the MongoDB Java client and Monger.

### MongoDB Java Driver Update

MongoDB Java driver dependency has been [updated to 2.12.x](https://github.com/mongodb/mongo-java-driver/releases/tag/r2.12.0-rc0).

### Default WriteConcern Change

Monger now uses [`WriteConcern/ACKNOWLEDGED`](http://api.mongodb.org/java/2.12/com/mongodb/WriteConcern.html#ACKNOWLEDGED) by default. Functionality-wise
it is the same as `WriteConcern/SAFE` in earlier versions.
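Applications that relied on another concern can still set one explicitly; a sketch assuming `monger.core/set-default-write-concern!` (hedged: verify the exact helper name in your Monger version):

``` clojure
(require '[monger.core :as mg])
(import com.mongodb.WriteConcern)

;; ACKNOWLEDGED is now the default; override it globally if needed.
(mg/set-default-write-concern! WriteConcern/JOURNALED)
```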
## Changes between 1.7.0 and 1.8.0-beta1

### monger.core/connect-via-uri

`monger.core/connect-via-uri` is a version of `monger.core/connect-via-uri!`
which returns the connection instead of mutating a var.

It should be used by projects that are built from reloadable
components, together with `monger.multi.*`.
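A sketch of the non-mutating variant (the URI is illustrative; in later releases the function returns a map with `:conn` and `:db`):

``` clojure
(require '[monger.core :as mg])

;; Unlike connect-via-uri!, nothing is stored in a shared dynamic var;
;; the caller holds on to the returned connection.
(let [{:keys [conn db]} (mg/connect-via-uri "mongodb://127.0.0.1/monger-test")]
  )
```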
## Changes between 1.7.0-beta1 and 1.7.0

### MongoDB Java Driver Update

MongoDB Java driver dependency has been [updated to 2.11.3](https://github.com/mongodb/mongo-java-driver/releases/tag/r2.11.3).

### Ragtime Dependency Dropped

Ragtime is now an optional dependency: if your project uses `monger.ragtime`, you
need to add Ragtime to your own `project.clj`:

``` clojure
[ragtime/ragtime.core "0.3.4"]
```

### Validateur Dependency Dropped

[Validateur](http://clojurevalidations.info) is no longer a dependency.


## Changes between 1.6.0 and 1.7.0-beta1

### Fine Tuning Cursor Options

The `monger.query` DSL now provides a way to fine tune database cursor
options:

``` clojure
(with-collection "products"
  ...
  (options {:notimeout true, :slaveok false}) ;; where keyword matches Bytes/QUERYOPTION_*
  (options [:notimeout :slaveok])
  (options com.mongodb.Bytes/QUERYOPTION_NOTIMEOUT) ;; supports Java constants
  (options :notimeout)
  ...)
```

`monger.cursor` is a new namespace that provides the plumbing for cursor
fine tuning but is not meant to be widely used directly.


### Joda Time Integration Improvements: LocalDate

`LocalDate` instance serialization is now supported
by Monger's Joda Time integration.

Contributed by Timo Sulg.
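A sketch of storing a `LocalDate` (loading `monger.joda-time` is what extends the BSON conversion protocols; collection and field names are illustrative):

``` clojure
(require 'monger.joda-time)
(require '[monger.core :as mg]
         '[monger.collection :as mc])
(import org.joda.time.LocalDate)

;; With monger.joda-time loaded, LocalDate values serialize like any
;; other supported field type.
(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")]
  (mc/insert db "events" {:name "release" :on (LocalDate. 2013 5 1)}))
```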
### Clojure 1.3 Is No Longer Supported

Monger now officially supports Clojure 1.4+.


### Cheshire Upgrade

The [Cheshire](https://github.com/dakrone/cheshire) dependency has been upgraded to 5.2.0.


### ClojureWerkz Support Upgrade

The ClojureWerkz Support dependency has been updated to `0.19.0`.


### Validateur 1.5.0

The [Validateur](https://github.com/michaelklishin/validateur) dependency has been upgraded to 1.5.0.


## Changes between 1.5.0 and 1.6.0

### monger.multi.collection
## README.md (96 changes)

@@ -1,24 +1,20 @@

# Monger, a modern Clojure MongoDB Driver

[](https://travis-ci.org/xingzhefeng/monger)

Monger is an idiomatic [Clojure MongoDB driver](http://clojuremongodb.info) for a more civilized age.

It has batteries included, offers a powerful, expressive query DSL,
strives to support modern MongoDB features and to have the "look and feel" and
flexibility of the MongoDB shell.

Monger is built for modern Clojure versions and sits on top of
the official MongoDB Java driver.
It has batteries included, offers powerful expressive query DSL, strives to support every MongoDB 2.0+ feature and has sane defaults. Monger is built from the
ground up for Clojure 1.3+ and sits on top of the official MongoDB Java driver.


## Project Goals

There is one MongoDB client for Clojure that has been around since 2009. So, why create another one? Monger authors
wanted a client that would
wanted a client that will

* Support most modern MongoDB features, focusing on those that really matter.
* Support most of MongoDB 2.0+ features, focus on those that really matter.
* Be [well documented](http://clojuremongodb.info).
* Be [well tested](https://github.com/michaelklishin/monger/tree/master/test/monger/test).
* Target modern Clojure versions.
* Target Clojure 1.3.0 and later from the ground up.
* Be as close to the Mongo shell query language as practical.
* Integrate with libraries like Joda Time, [Cheshire](https://github.com/dakrone/cheshire), clojure.data.json, [Ragtime](https://github.com/weavejester/ragtime).
* Support URI connections to be friendly to Heroku and other PaaS providers.

@@ -27,17 +23,23 @@ wanted a client that would

## Community

[Monger has a mailing list](https://groups.google.com/forum/#!forum/clojure-mongodb). Feel free to join it and ask any questions you may have.

To subscribe for announcements of releases, important changes and so on, please follow [@ClojureWerkz](https://twitter.com/#!/clojurewerkz) on Twitter.


## Project Maturity

Monger is not a young project: started in July 2011, it is over 7
years old with active production use from week 1.
Monger is not a young project: started in July 2011, it is over 1 year old with active production use from week 1.


## Artifacts

Monger artifacts are [released to Clojars](https://clojars.org/com.novemberain/monger). If you are using
Maven, add the following repository definition to your `pom.xml`:

``` xml
<repository>
```

@@ -50,60 +52,69 @@ Maven, add the following repository definition to your `pom.xml`:

With Leiningen:

    [com.novemberain/monger "3.5.0"]
    [com.novemberain/monger "1.6.0-beta2"]

With Maven:

    <dependency>
      <groupId>com.novemberain</groupId>
      <artifactId>monger</artifactId>
      <version>3.5.0</version>
      <version>1.6.0-beta2</version>
    </dependency>


## Getting Started

Please refer to our [Getting Started guide](http://clojuremongodb.info/articles/getting_started.html). Don't hesitate to join our [mailing list](https://groups.google.com/forum/#!forum/clojure-mongodb) and ask questions, too!
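The basic workflow from the guide can be sketched as follows (database, collection, and field names are illustrative):

``` clojure
(require '[monger.core :as mg]
         '[monger.collection :as mc])

;; Connect to localhost:27017, pick a database, insert and query.
(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")]
  (mc/insert db "documents" {:first_name "John" :last_name "Lennon"})
  (mc/find-maps db "documents" {:first_name "John"}))
```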
## Documentation & Examples

Please see our [documentation guides site](http://clojuremongodb.info/) and [API reference](http://reference.clojuremongodb.info).

Our [test suite](https://github.com/michaelklishin/monger/tree/master/test/monger/test) also has many code examples.


## Community

[Monger has a mailing list](https://groups.google.com/forum/#!forum/clojure-mongodb). Feel
free to join it and ask any questions you may have.

To subscribe for announcements of releases, important changes and so
on, please follow [@ClojureWerkz](https://twitter.com/#!/clojurewerkz)
on Twitter.


## Supported Clojure versions

Monger requires Clojure 1.8+. The most recent
stable release is highly recommended.
Monger is built from the ground up for Clojure 1.3 and up. The most recent
stable release is recommended.


## Continuous Integration Status

[](http://travis-ci.org/michaelklishin/monger)
[](http://travis-ci.org/michaelklishin/monger)


## Monger Is a ClojureWerkz Project

Monger is part of the [group of Clojure libraries known as ClojureWerkz](http://clojurewerkz.org), together with
[Cassaforte](http://clojurecassandra.info), [Langohr](http://clojurerabbitmq.info), [Elastisch](http://clojureelasticsearch.info), [Quartzite](http://clojurequartz.info) and several others.
[Neocons](https://github.com/michaelklishin/neocons), [Langohr](https://github.com/michaelklishin/langohr), [Elastisch](https://github.com/clojurewerkz/elastisch), [Welle](https://github.com/michaelklishin/welle), [Quartzite](https://github.com/michaelklishin/quartzite) and several others.


## Write Performance

Monger insert operations are efficient and have very little overhead compared to the underlying Java driver. Here
are some (very unscientific) numbers on a MacBook Pro from fall 2010 with a Core i7 and an Intel SSD drive:

```
Testing monger.test.stress
Inserting 1000 documents...
"Elapsed time: 25.699 msecs"
Inserting 10000 documents...
"Elapsed time: 135.069 msecs"
Inserting 100000 documents...
"Elapsed time: 515.969 msecs"
```

With the `SAFE` write concern, it takes roughly 0.5 second to insert 100,000 documents with Clojure 1.3.0.


@@ -112,14 +123,7 @@ Monger is part of the [group of Clojure libraries known as ClojureWerkz](http://

Monger uses [Leiningen 2](https://github.com/technomancy/leiningen/blob/master/doc/TUTORIAL.md). Make sure you have it installed and then run tests against
supported Clojure versions using

    ./bin/ci/before_script.sh
    lein all do clean, javac, test

Or, if you don't have MongoDB installed, you can use Docker:

    docker-compose up
    ./bin/ci/before_script_docker.sh
    lein all do clean, javac, test
    lein2 all do clean, javac, test

Then create a branch and make your changes on it. Once you are done with your changes and all tests pass, submit a pull request
on GitHub.

@@ -128,7 +132,7 @@ on Github.

## License

Copyright (C) 2011-2018 [Michael S. Klishin](http://twitter.com/michaelklishin), Alex Petrov, and the ClojureWerkz team.
Copyright (C) 2011-2013 [Michael S. Klishin](http://twitter.com/michaelklishin)

Double licensed under the [Eclipse Public License](http://www.eclipse.org/legal/epl-v10.html) (the same as Clojure) or
the [Apache Public License 2.0](http://www.apache.org/licenses/LICENSE-2.0.html).
## (file name not shown)

```
@@ -1,18 +1,8 @@
#!/bin/sh

# Check which MongoDB shell is available
if command -v mongosh >/dev/null 2>&1; then
    MONGO_SHELL="mongosh"
elif command -v mongo >/dev/null 2>&1; then
    MONGO_SHELL="mongo"
else
    echo "Error: Neither mongo nor mongosh shell found. Please install MongoDB shell."
    exit 1
fi

# MongoDB Java driver won't run authentication twice on the same DB instance,
# so we need to use multiple DBs.
$MONGO_SHELL --eval 'db.createUser({"user": "clojurewerkz/monger", "pwd": "monger", roles: ["dbAdmin"], mechanisms: ["SCRAM-SHA-1"], passwordDigestor: "client"})' monger-test
$MONGO_SHELL --eval 'db.createUser({"user": "clojurewerkz/monger", "pwd": "monger", roles: ["dbAdmin"], mechanisms: ["SCRAM-SHA-1"], passwordDigestor: "client"})' monger-test2
$MONGO_SHELL --eval 'db.createUser({"user": "clojurewerkz/monger", "pwd": "monger", roles: ["dbAdmin"], mechanisms: ["SCRAM-SHA-1"], passwordDigestor: "client"})' monger-test3
$MONGO_SHELL --eval 'db.createUser({"user": "clojurewerkz/monger", "pwd": "monger", roles: ["dbAdmin"], mechanisms: ["SCRAM-SHA-1"], passwordDigestor: "client"})' monger-test4
mongo --eval 'db.addUser("clojurewerkz/monger", "monger")' monger-test
mongo --eval 'db.addUser("clojurewerkz/monger", "monger")' monger-test2
mongo --eval 'db.addUser("clojurewerkz/monger", "monger")' monger-test3
mongo --eval 'db.addUser("clojurewerkz/monger", "monger")' monger-test4
```
## (file name not shown)

```
@@ -1,18 +0,0 @@
#!/bin/sh

# Check which MongoDB shell is available in the container
if docker exec mongo_test which mongosh >/dev/null 2>&1; then
    MONGO_SHELL="mongosh"
elif docker exec mongo_test which mongo >/dev/null 2>&1; then
    MONGO_SHELL="mongo"
else
    echo "Error: Neither mongo nor mongosh shell found in the container."
    exit 1
fi

# MongoDB Java driver won't run authentication twice on the same DB instance,
# so we need to use multiple DBs.
docker exec mongo_test $MONGO_SHELL --eval 'db.createUser({"user": "clojurewerkz/monger", "pwd": "monger", roles: ["dbAdmin"], mechanisms: ["SCRAM-SHA-1"], passwordDigestor: "client"})' monger-test
docker exec mongo_test $MONGO_SHELL --eval 'db.createUser({"user": "clojurewerkz/monger", "pwd": "monger", roles: ["dbAdmin"], mechanisms: ["SCRAM-SHA-1"], passwordDigestor: "client"})' monger-test2
docker exec mongo_test $MONGO_SHELL --eval 'db.createUser({"user": "clojurewerkz/monger", "pwd": "monger", roles: ["dbAdmin"], mechanisms: ["SCRAM-SHA-1"], passwordDigestor: "client"})' monger-test3
docker exec mongo_test $MONGO_SHELL --eval 'db.createUser({"user": "clojurewerkz/monger", "pwd": "monger", roles: ["dbAdmin"], mechanisms: ["SCRAM-SHA-1"], passwordDigestor: "client"})' monger-test4
```
## (file name not shown)

```
@@ -1,11 +0,0 @@
#!/bin/sh

# MongoDB seems to need some time to boot first. MK.
sleep 5

# MongoDB Java driver won't run authentication twice on the same DB instance,
# so we need to use multiple DBs.
mongo --eval 'db.createUser({"user": "clojurewerkz/monger", "pwd": "monger", roles: ["dbAdmin"], passwordDigestor: "client"})' monger-test
mongo --eval 'db.createUser({"user": "clojurewerkz/monger", "pwd": "monger", roles: ["dbAdmin"], passwordDigestor: "client"})' monger-test2
mongo --eval 'db.createUser({"user": "clojurewerkz/monger", "pwd": "monger", roles: ["dbAdmin"], passwordDigestor: "client"})' monger-test3
mongo --eval 'db.createUser({"user": "clojurewerkz/monger", "pwd": "monger", roles: ["dbAdmin"], passwordDigestor: "client"})' monger-test4
```
## (file name not shown)

```
@@ -1,8 +0,0 @@
#!/bin/sh

sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 9DA31620334BD75D9DCB49F368818C72E52529D4

echo "deb [ arch=amd64,arm64 ] https://repo.mongodb.org/apt/ubuntu xenial/mongodb-org/4.0 multiverse" | sudo tee /etc/apt/sources.list.d/mongodb-org-4.0.list

sudo apt-get update
sudo apt-get install -y mongodb-org
```
## (file name not shown)

```
@@ -1,11 +0,0 @@
# Use root/example as user/password credentials
version: '3.1'

services:

  mongo:
    image: mongo
    container_name: mongo_test
    restart: always
    ports:
      - "27017:27017"
```
## project.clj (62 changes)

```
@@ -1,12 +1,13 @@
(defproject com.novemberain/monger "4.0.0-SNAPSHOT"
(defproject com.novemberain/monger "1.6.1-SNAPSHOT"
  :description "Monger is a Clojure MongoDB client for a more civilized age: friendly, flexible and with batteries included"
  :url "http://clojuremongodb.info"
  :min-lein-version "2.5.1"
  :license {:name "Eclipse Public License"
            :url "http://www.eclipse.org/legal/epl-v10.html"}
  :dependencies [[org.clojure/clojure "1.11.1"]
                 [org.mongodb/mongodb-driver "3.12.11"]
                 [clojurewerkz/support "1.5.0"]]
  :min-lein-version "2.0.0"
  :license {:name "Eclipse Public License"}
  :dependencies [[org.clojure/clojure "1.5.1"]
                 [org.mongodb/mongo-java-driver "2.11.2"]
                 [com.novemberain/validateur "1.4.0"]
                 [clojurewerkz/support "0.15.0"]
                 [ragtime/ragtime.core "0.3.3"]]
  :test-selectors {:default (fn [m]
                              (and (not (:performance m))
                                   (not (:edge-features m))
@@ -27,32 +28,39 @@
                   :all (constantly true)}
  :source-paths ["src/clojure"]
  :java-source-paths ["src/java"]
  :javac-options ["-target" "1.8" "-source" "1.8"]
  :javac-options ["-target" "1.6" "-source" "1.6"]
  :mailing-list {:name "clojure-mongodb"
                 :archive "https://groups.google.com/group/clojure-mongodb"
                 :post "clojure-mongodb@googlegroups.com"}
  :profiles {:1.10 {:dependencies [[org.clojure/clojure "1.10.2"]]}
             :1.9 {:dependencies [[org.clojure/clojure "1.9.0"]]}
  :profiles {:1.3 {:dependencies [[org.clojure/clojure "1.3.0"]]}
             :dj01x {:dependencies [[org.clojure/data.json "0.1.2" :exclusions [org.clojure/clojure]]]}
             :dj02x {:dependencies [[org.clojure/data.json "0.2.1" :exclusions [org.clojure/clojure]]]}
             :1.4 {:dependencies [[org.clojure/clojure "1.4.0"]]}
             :1.6 {:dependencies [[org.clojure/clojure "1.6.0-master-SNAPSHOT"]]}
             :master {:dependencies [[org.clojure/clojure "1.6.0-master-SNAPSHOT"]]}
             :dev {:resource-paths ["test/resources"]
                   :dependencies [[clj-time "0.15.1" :exclusions [org.clojure/clojure]]
                                  [cheshire "5.8.1" :exclusions [org.clojure/clojure]]
                                  [org.clojure/data.json "2.5.0" :exclusions [org.clojure/clojure]]
                                  [org.clojure/tools.cli "0.4.1" :exclusions [org.clojure/clojure]]
                                  [org.clojure/core.cache "0.7.1" :exclusions [org.clojure/clojure]]
                                  [ring/ring-core "1.7.1" :exclusions [org.clojure/clojure]]
                                  [com.novemberain/validateur "2.6.0" :exclusions [org.clojure/clojure]]
                                  [ch.qos.logback/logback-classic "1.2.3" :exclusions [org.slf4j/slf4j-api]]
                                  [ragtime/core "0.7.2" :exclusions [org.clojure/clojure]]]
                   :plugins [[lein-codox "0.10.5"]]
                   :codox {:source-paths ["src/clojure"]
                           :namespaces [#"^monger\.(?!internal)"]}}
                   :dependencies [[clj-time "0.5.0" :exclusions [org.clojure/clojure]]
                                  [cheshire "5.0.2" :exclusions [org.clojure/clojure]]
                                  [org.clojure/tools.cli "0.2.1" :exclusions [org.clojure/clojure]]
                                  [org.clojure/core.cache "0.6.2" :exclusions [org.clojure/clojure]]
                                  [ring/ring-core "1.1.8"]]
                   :plugins [[codox "0.6.4"]]
                   :codox {:sources ["src/clojure"]
                           :output-dir "doc/api"
                           :exclude [monger.internal.pagination
                                     monger.internal.fn
                                     ;; these are not fully baked yet or have changes
                                     ;; that are not entirely backwards compatible with 1.0. MK.
                                     monger.testkit
                                     monger.ring.session-store]}}
             ;; only clj-time/JodaTime available, used to test monger.joda-time w/o clojure.data.json
             :dev2 {:resource-paths ["test/resources"]
                    :dependencies [[clj-time "0.15.2" :exclusions [org.clojure/clojure]]]}}
  :aliases {"all" ["with-profile" "dev:dev,1.10:dev,1.9:dev"]}
  :repositories {"sonatype" {:url "https://oss.sonatype.org/content/repositories/releases"
                    :dependencies [[clj-time "0.5.0" :exclusions [org.clojure/clojure]]]}}
  :aliases {"all" ["with-profile" "dev:dev,1.3:dev,1.4:dev,dj01x:dev,dj02x:dev,1.6"]}
  :repositories {"sonatype" {:url "http://oss.sonatype.org/content/repositories/releases"
                             :snapshots false
                             :releases {:checksum :fail :update :always}}
                 "sonatype-snapshots" {:url "https://oss.sonatype.org/content/repositories/snapshots"
                 "sonatype-snapshots" {:url "http://oss.sonatype.org/content/repositories/snapshots"
                                       :snapshots true
                                       :releases {:checksum :fail :update :always}}})
                                       :releases {:checksum :fail :update :always}}}
  :aot [monger.conversion])
```
|
|
|||
|
|
@ -1,41 +1,9 @@
|
|||
;; This source code is dual-licensed under the Apache License, version
;; 2.0, and the Eclipse Public License, version 1.0.
;;
;; The APL v2.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team
;;
;; Licensed under the Apache License, Version 2.0 (the "License");
;; you may not use this file except in compliance with the License.
;; You may obtain a copy of the License at
;;
;; http://www.apache.org/licenses/LICENSE-2.0
;;
;; Unless required by applicable law or agreed to in writing, software
;; distributed under the License is distributed on an "AS IS" BASIS,
;; WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
;; See the License for the specific language governing permissions and
;; limitations under the License.
;; ----------------------------------------------------------------------------------
;;
;; The EPL v1.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team.
;; All rights reserved.
;;
;; This program and the accompanying materials are made available under the terms of
;; the Eclipse Public License Version 1.0,
;; which accompanies this distribution and is available at
;; http://www.eclipse.org/legal/epl-v10.html.
;; ----------------------------------------------------------------------------------

(ns monger.cache
  "clojure.core.cache implementation(s) on top of MongoDB.
(ns ^{:doc "clojure.core.cache implementation(s) on top of MongoDB.

   Related documentation guide: http://clojuremongodb.info/articles/integration.html"
  (:require [monger.collection :as mc :refer [find-one find-by-id find-map-by-id]]
      Related documentation guide: http://clojuremongodb.info/articles/integration.html"
      :author "Michael S. Klishin"}
  monger.cache
  (:require [monger.collection :as mc]
            [clojure.core.cache :as cache]
            [monger.conversion :as cnv])
  (:import clojure.core.cache.CacheProtocol
@@ -49,37 +17,87 @@
(def ^{:const true}
  default-cache-collection "cache_entries")


(defn- ^DBObject find-one
  [^DB db ^String collection ^Map ref]
  (.findOne (.getCollection db (name collection))
            (cnv/to-db-object ref)))

(defn- find-by-id
  "A version of monger.collection/find-by-id that does not require the
   fields argument"
  [^DB db ^String collection id]
  (find-one db collection {:_id id}))

(defn- find-map-by-id
  "A version of monger.collection/find-by-map-id that accepts database
   as an argument"
  [^DB db ^String collection id]
  (cnv/from-db-object ^DBObject (find-one db collection {:_id id}) true))

;;
;; API
;;

(defrecord BasicMongerCache [db collection])
(defrecord BasicMongerCache [collection])

(extend-protocol cache/CacheProtocol
  BasicMongerCache
  (lookup [c k]
    (let [m (mc/find-map-by-id (:db c) (:collection c) k)]
    (let [m (mc/find-map-by-id (:collection c) k)]
      (:value m)))
  (has? [c k]
    (not (nil? (mc/find-by-id (:db c) (:collection c) k))))
    (not (nil? (mc/find-by-id (get c :collection) k))))
  (hit [this k]
    this)
  (miss [c k v]
    (mc/insert (:db c) (:collection c) {:_id k :value v})
    (mc/insert (get c :collection) {:_id k :value v})
    c)
  (evict [c k]
    (mc/remove-by-id (get c :collection) k)
    c)
  (seed [c m]
    (mc/insert-batch (get c :collection) (map (fn [[k v]]
                                                {:_id k :value v}) m))
    c))


(defn basic-monger-cache-factory
  ([]
   (BasicMongerCache. default-cache-collection))
  ([collection]
   (BasicMongerCache. collection))
  ([collection base]
   (cache/seed (BasicMongerCache. collection) base)))


(defrecord DatabaseAwareMongerCache [db collection])

(extend-protocol cache/CacheProtocol
  DatabaseAwareMongerCache
  (lookup [c k]
    (let [m (find-map-by-id (:db c) (:collection c) k)]
      (:value m)))
  (has? [c k]
    (not (nil? (find-by-id (:db c) (:collection c) k))))
  (hit [this k]
    this)
  (miss [c k v]
    (mc/insert (:db c) (:collection c) {:_id k :value v} WriteConcern/SAFE)
    c)
  (evict [c k]
    (mc/remove-by-id (:db c) (:collection c) k)
    c)
  (seed [c m]
    (mc/insert-batch (:db c) (:collection c) (map (fn [[k v]]
                                                    {:_id k :value v}) m))
                                                    {:_id k :value v}) m) WriteConcern/SAFE)
    c))


(defn basic-monger-cache-factory
  ([^DB db]
   (BasicMongerCache. db default-cache-collection))
  ([^DB db collection]
   (BasicMongerCache. db collection))
  ([^DB db collection base]
   (cache/seed (BasicMongerCache. db collection) base)))
(defn db-aware-monger-cache-factory
  ([db]
   (DatabaseAwareMongerCache. db default-cache-collection))
  ([db collection]
   (DatabaseAwareMongerCache. db collection))
  ([db collection base]
   (cache/seed (DatabaseAwareMongerCache. db collection) base)))
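For readers comparing the two branches above: the `master` cache factories take an explicit `DB` handle, while the 1.6.x ones rely on dynamic vars. A minimal usage sketch of the explicit-db variant, through the `clojure.core.cache` protocol (assumes a running local MongoDB; the connection and database names here are illustrative, not from the diff):

```clojure
(ns example.cache-usage
  (:require [monger.core :as mg]
            [monger.cache :as mcache]
            [clojure.core.cache :as cc]))

;; connect to a local MongoDB and pick a database (hypothetical db name)
(let [conn (mg/connect)
      db   (mg/get-db conn "monger-cache-demo")
      ;; cache backed by the default "cache_entries" collection
      c    (mcache/basic-monger-cache-factory db)]
  ;; store a value on miss, then read it back via the protocol
  (cc/miss c "answer" 42)
  (cc/lookup c "answer"))
```

Note that `miss`, `evict`, and `seed` return the cache itself, so cache operations can be threaded just like with the in-memory `core.cache` implementations.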
@@ -1,40 +1,13 @@
;; This source code is dual-licensed under the Apache License, version
;; 2.0, and the Eclipse Public License, version 1.0.
;;
;; The APL v2.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team
;; Copyright (c) 2011-2012 Michael S. Klishin
;; Copyright (c) 2012 Toby Hede
;; Copyright (c) 2012 Baishampayan Ghose
;;
;; Licensed under the Apache License, Version 2.0 (the "License");
;; you may not use this file except in compliance with the License.
;; You may obtain a copy of the License at
;;
;; http://www.apache.org/licenses/LICENSE-2.0
;;
;; Unless required by applicable law or agreed to in writing, software
;; distributed under the License is distributed on an "AS IS" BASIS,
;; WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
;; See the License for the specific language governing permissions and
;; limitations under the License.
;; ----------------------------------------------------------------------------------
;;
;; The EPL v1.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team.
;; Copyright (c) 2012 Toby Hede
;; Copyright (c) 2012 Baishampayan Ghose
;;
;; All rights reserved.
;;
;; This program and the accompanying materials are made available under the terms of
;; the Eclipse Public License Version 1.0,
;; which accompanies this distribution and is available at
;; http://www.eclipse.org/legal/epl-v10.html.
;; ----------------------------------------------------------------------------------
;; The use and distribution terms for this software are covered by the
;; Eclipse Public License 1.0 (http://opensource.org/licenses/eclipse-1.0.php)
;; which can be found in the file epl-v10.html at the root of this distribution.
;; By using this software in any fashion, you are agreeing to be bound by
;; the terms of this license.
;; You must not remove this notice, or any other, from this software.

(ns monger.collection
  "Provides key functionality for interaction with MongoDB: inserting, querying, updating and deleting documents, performing Aggregation Framework
@@ -50,19 +23,16 @@
   * http://clojuremongodb.info/articles/updating.html
   * http://clojuremongodb.info/articles/deleting.html
   * http://clojuremongodb.info/articles/aggregation.html"
  (:refer-clojure :exclude [find remove count drop distinct empty? any? update])
  (:import [com.mongodb Mongo DB DBCollection WriteResult DBObject WriteConcern
            DBCursor MapReduceCommand MapReduceCommand$OutputType AggregationOutput
            AggregationOptions AggregationOptions$OutputMode]
  (:refer-clojure :exclude [find remove count drop distinct empty?])
  (:import [com.mongodb Mongo DB DBCollection WriteResult DBObject WriteConcern DBCursor MapReduceCommand MapReduceCommand$OutputType]
           [java.util List Map]
           [java.util.concurrent TimeUnit]
           [clojure.lang IPersistentMap ISeq]
           org.bson.types.ObjectId)
  (:require [monger.core :as mc]
            [monger.result :as mres]
            [monger.conversion :refer :all]
            [monger.constraints :refer :all]
            [monger.util :refer [into-array-list]]))
  (:require monger.core
            monger.result)
  (:use monger.conversion
        monger.constraints))



;;
@@ -74,17 +44,28 @@
;;

(defn ^WriteResult insert
  "Saves document to collection and returns a write result monger.result/acknowledged?
   and related functions operate on. You can optionally specify a WriteConcern.
  "Saves @document@ to @collection@ and returns a write result monger.result/ok? and similar functions operate on. You can optionally specify a WriteConcern.

   In case you need the exact inserted document returned, with the :_id key generated,
   use monger.collection/insert-and-return instead."
  ([^DB db ^String coll document]
   (.insert (.getCollection db (name coll))
   In case you need the exact inserted document returned, with the :_id key generated, use monger.collection/insert-and-return
   instead.

   EXAMPLES:

       ;; returns write result
       (monger.collection/insert \"people\" {:name \"Joe\", :age 30})

       (monger.collection/insert \"people\" {:name \"Joe\", :age 30} WriteConcern/SAFE)
  "
  ([^String collection document]
   (.insert (.getCollection monger.core/*mongodb-database* (name collection))
            (to-db-object document)
            ^WriteConcern mc/*mongodb-write-concern*))
  ([^DB db ^String coll document ^WriteConcern concern]
   (.insert (.getCollection db (name coll))
            ^WriteConcern monger.core/*mongodb-write-concern*))
  ([^String collection document ^WriteConcern concern]
   (.insert (.getCollection monger.core/*mongodb-database* (name collection))
            (to-db-object document)
            concern))
  ([^DB db ^String collection document ^WriteConcern concern]
   (.insert (.getCollection db (name collection))
            (to-db-object document)
            concern)))
@@ -92,30 +73,49 @@
(defn ^clojure.lang.IPersistentMap insert-and-return
  "Like monger.collection/insert but returns the inserted document as a persistent Clojure map.

   If the :_id key wasn't set on the document, it will be generated and merged into the returned
   map."
  ([^DB db ^String coll document]
   (insert-and-return db coll document ^WriteConcern mc/*mongodb-write-concern*))
  ([^DB db ^String coll document ^WriteConcern concern]
   ;; MongoDB Java driver will generate the _id and set it but it
   ;; tries to mutate the inserted DBObject and it does not work
   ;; very well in our case, because that DBObject is short lived
   ;; and produced from the Clojure map we are passing in. Plus,
   ;; this approach is very awkward with immutable data structures
   ;; being the default. MK.
   If the :_id key wasn't set on the document, it will be generated and merged into the returned map.

   EXAMPLES:

       ;; returns the entire document with :_id generated
       (monger.collection/insert-and-return \"people\" {:name \"Joe\", :age 30})

       (monger.collection/insert-and-return \"people\" {:name \"Joe\", :age 30} WriteConcern/SAFE)
  "
  ([^String collection document]
   (insert-and-return ^DB monger.core/*mongodb-database* collection document ^WriteConcern monger.core/*mongodb-write-concern*))
  ([^String collection document ^WriteConcern concern]
   (insert-and-return ^DB monger.core/*mongodb-database* collection document concern))
  ([^DB db ^String collection document ^WriteConcern concern]
   ;; MongoDB Java driver will generate the _id and set it but it tries to mutate the inserted DBObject
   ;; and it does not work very well in our case, because that DBObject is short lived and produced
   ;; from the Clojure map we are passing in. Plus, this approach is very awkward with immutable data
   ;; structures being the default. MK.
   (let [doc (merge {:_id (ObjectId.)} document)]
     (insert db coll doc concern)
     (insert db collection doc concern)
     doc)))


(defn ^WriteResult insert-batch
  "Saves documents to collection. You can optionally specify WriteConcern as a third argument."
  ([^DB db ^String coll ^List documents]
   (.insert (.getCollection db (name coll))
  "Saves @documents@ to @collection@. You can optionally specify WriteConcern as a third argument.

   EXAMPLES:

       (monger.collection/insert-batch \"people\" [{:name \"Joe\", :age 30}, {:name \"Paul\", :age 27}])

       (monger.collection/insert-batch \"people\" [{:name \"Joe\", :age 30}, {:name \"Paul\", :age 27}] WriteConcern/NORMAL)
  "
  ([^String collection ^List documents]
   (.insert (.getCollection monger.core/*mongodb-database* (name collection))
            ^List (to-db-object documents)
            ^WriteConcern mc/*mongodb-write-concern*))
  ([^DB db ^String coll ^List documents ^WriteConcern concern]
   (.insert (.getCollection db (name coll))
            ^WriteConcern monger.core/*mongodb-write-concern*))
  ([^String collection ^List documents ^WriteConcern concern]
   (.insert (.getCollection monger.core/*mongodb-database* (name collection))
            ^List (to-db-object documents)
            concern))
  ([^DB db ^String collection ^List documents ^WriteConcern concern]
   (.insert (.getCollection db (name collection))
            ^List (to-db-object documents)
            concern)))
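The hunks above capture the central API shift between the branches: on `master`, every insert takes an explicit `DB` handle, while on 1.6.x the database comes from the `monger.core/*mongodb-database*` dynamic var. A side-by-side sketch (assuming `db` is a `DB` handle obtained from `monger.core/get-db` and a `people` collection name chosen for illustration):

```clojure
;; master branch style: the database handle is passed explicitly
(mc/insert db "people" {:name "Joe" :age 30})
(mc/insert-batch db "people" [{:name "Joe" :age 30}
                              {:name "Paul" :age 27}])

;; 1.6.x style: relies on the *mongodb-database* dynamic var
;; set up beforehand via monger.core/connect! and set-db!
(monger.collection/insert "people" {:name "Joe" :age 30})
```

The explicit-handle style makes the functions usable with several databases concurrently, without rebinding a dynamic var.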
@@ -126,67 +126,101 @@
(defn ^DBCursor find
  "Queries for objects in this collection.
   This function returns DBCursor, which allows you to iterate over DBObjects.
   If you want to manipulate Clojure sequences of maps, use find-maps."
  ([^DB db ^String coll]
   (.find (.getCollection db (name coll))))
  ([^DB db ^String coll ^Map ref]
   (.find (.getCollection db (name coll))
   If you want to manipulate Clojure sequences of maps, use @find-maps@.

   EXAMPLES:
       ;; return all objects in this collection.
       (mgcol/find \"people\")

       ;; return all objects matching query
       (mgcol/find \"people\" {:company \"Comp Corp\"})

       ;; return all objects matching query, taking only specified fields
       (mgcol/find \"people\" {:company \"Comp Corp\"} [:first_name :last_name])
  "
  ([^String collection]
   (.find (.getCollection monger.core/*mongodb-database* (name collection))))
  ([^String collection ^Map ref]
   (.find (.getCollection monger.core/*mongodb-database* (name collection))
          (to-db-object ref)))
  ([^DB db ^String coll ^Map ref fields]
   (.find (.getCollection db (name coll))
  ([^String collection ^Map ref fields]
   (.find (.getCollection monger.core/*mongodb-database* (name collection))
          (to-db-object ref)
          (as-field-selector fields)))
  ([^DB db ^String collection ^Map ref fields]
   (.find (.getCollection db (name collection))
          (to-db-object ref)
          (as-field-selector fields))))

(defn find-maps
  "Queries for objects in this collection.
   This function returns a Clojure seq of maps.
   If you want to work directly with DBObject, use find."
  ([^DB db ^String coll]
   (with-open [dbc (find db coll)]
   If you want to work directly with DBObject, use find.
  "
  ([^String collection]
   (with-open [dbc (find collection)]
     (map (fn [x] (from-db-object x true)) dbc)))
  ([^String collection ^Map ref]
   (with-open [dbc (find collection ref)]
     (map (fn [x] (from-db-object x true)) dbc)))
  ([^DB db ^String coll ^Map ref]
   (with-open [dbc (find db coll ref)]
  ([^String collection ^Map ref fields]
   (with-open [dbc (find collection ref fields)]
     (map (fn [x] (from-db-object x true)) dbc)))
  ([^DB db ^String coll ^Map ref fields]
   (find-maps db coll ref fields true))
  ([^DB db ^String coll ^Map ref fields keywordize]
   (with-open [dbc (find db coll ref fields)]
     (map (fn [x] (from-db-object x keywordize)) dbc))))
  ([^DB db ^String collection ^Map ref fields]
   (with-open [dbc (find db collection ref fields)]
     (map (fn [x] (from-db-object x true)) dbc))))

(defn find-seq
  "Queries for objects in this collection, returns ISeq of DBObjects."
  ([^DB db ^String coll]
   (with-open [dbc (find db coll)]
     (seq dbc)))
  ([^DB db ^String coll ^Map ref]
   (with-open [dbc (find db coll ref)]
     (seq dbc)))
  ([^DB db ^String coll ^Map ref fields]
   (with-open [dbc (find db coll ref fields)]
     (seq dbc))))
  ([^String collection]
   (with-open [dbc (find collection)]
     (seq dbc)))
  ([^String collection ^Map ref]
   (with-open [dbc (find collection ref)]
     (seq dbc)))
  ([^String collection ^Map ref fields]
   (with-open [dbc (find collection ref fields)]
     (seq dbc)))
  ([^DB db ^String collection ^Map ref fields]
   (with-open [dbc (find db collection ref fields)]
     (seq dbc))))

;;
;; monger.collection/find-one
;;

(defn ^DBObject find-one
  "Returns a single DBObject from this collection matching the query."
  ([^DB db ^String coll ^Map ref]
   (.findOne (.getCollection db (name coll))
  "Returns a single DBObject from this collection matching the query.

   EXAMPLES:

       (mgcol/find-one collection {:language \"Clojure\"})

       ;; Return only :language field.
       ;; Note that _id field is always returned.
       (mgcol/find-one collection {:language \"Clojure\"} [:language])
  "
  ([^String collection ^Map ref]
   (.findOne (.getCollection monger.core/*mongodb-database* (name collection))
             (to-db-object ref)))
  ([^DB db ^String coll ^Map ref fields]
   (.findOne (.getCollection db (name coll))
  ([^String collection ^Map ref fields]
   (.findOne (.getCollection monger.core/*mongodb-database* (name collection))
             (to-db-object ref)
             ^DBObject (as-field-selector fields)))
  ([^DB db ^String collection ^Map ref fields]
   (.findOne (.getCollection db (name collection))
             (to-db-object ref)
             ^DBObject (as-field-selector fields))))

(defn ^IPersistentMap find-one-as-map
  "Returns a single object converted to Map from this collection matching the query."
  ([^DB db ^String coll ^Map ref]
   (from-db-object ^DBObject (find-one db coll ref) true))
  ([^DB db ^String coll ^Map ref fields]
   (from-db-object ^DBObject (find-one db coll ref fields) true))
  ([^DB db ^String coll ^Map ref fields keywordize]
   (from-db-object ^DBObject (find-one db coll ref fields) keywordize)))
  ([^String collection ^Map ref]
   (from-db-object ^DBObject (find-one collection ref) true))
  ([^String collection ^Map ref fields]
   (from-db-object ^DBObject (find-one collection ref fields) true))
  ([^String collection ^Map ref fields keywordize]
   (from-db-object ^DBObject (find-one collection ref fields) keywordize)))


;;
@@ -194,15 +228,32 @@
;;

(defn ^IPersistentMap find-and-modify
  "Atomically modify a document (at most one) and return it."
  ([^DB db ^String coll ^Map conditions ^Map document {:keys [fields sort remove return-new upsert keywordize] :or
                                                       {fields nil
                                                        sort nil
                                                        remove false
                                                        return-new false
                                                        upsert false
                                                        keywordize true}}]
   (let [coll (.getCollection db (name coll))
  "Atomically modify a document (at most one) and return it.

   EXAMPLES:

       ;; Find and modify a document
       (mgcol/find-and-modify collection {:language \"Python\"} {:language \"Clojure\"})

       ;; If multiple documents match, choose the first one in the specified order
       (mgcol/find-and-modify collection {:language \"Python\"} {:language \"Clojure\"} :sort {:language -1})

       ;; Remove the object before returning
       (mgcol/find-and-modify collection {:language \"Python\"} {} :remove true)

       ;; Return the modified object instead of the old one
       (mgcol/find-and-modify collection {:language \"Python\"} {:language \"Clojure\"} :return-new true)

       ;; Retrieve a subset of fields
       (mgcol/find-and-modify collection {:language \"Python\"} {:language \"Clojure\"} :fields [ :language ])

       ;; Create the object if it doesn't exist
       (mgcol/find-and-modify collection {:language \"Factor\"} {:language \"Clojure\"} :upsert true)
  "
  ([^String collection ^Map conditions ^Map document & {:keys [fields sort remove return-new upsert keywordize] :or
                                                        {fields nil sort nil remove false return-new false upsert false keywordize true}}]
   (let [coll (.getCollection monger.core/*mongodb-database* (name collection))
         maybe-fields (when fields (as-field-selector fields))
         maybe-sort (when sort (to-db-object sort))]
     (from-db-object
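Besides the database handle, the two branches also differ in how `find-and-modify` accepts its options: `master` takes a single options map, 1.6.x spreads them as keyword arguments. A sketch (the `scores` collection and query are illustrative, and `db`/`mc`/`mgcol` stand for a database handle and the usual `monger.collection` aliases):

```clojure
;; master branch: options are passed as one trailing map
(mc/find-and-modify db "scores" {:player "joe"} {"$inc" {:score 10}}
                    {:return-new true :upsert true})

;; 1.6.x: the same options are spread as keyword arguments (& {:keys [...]})
(mgcol/find-and-modify "scores" {:player "joe"} {"$inc" {:score 10}}
                       :return-new true :upsert true)
```

Passing options as a plain map composes better programmatically, since an options map can be built up and merged with `merge` before the call.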
@@ -214,73 +265,133 @@
;;

(defn ^DBObject find-by-id
  "Returns a single object with matching _id field."
  ([^DB db ^String coll id]
  "Returns a single object with matching _id field.

   EXAMPLES:

       (mgcol/find-one-by-id collection (ObjectId. \"4ef45ab4744e9fd632640e2d\"))

       ;; Return only :language field.
       ;; Note that _id field is always returned.
       (mgcol/find-one-by-id collection (ObjectId. \"4ef45ab4744e9fd632640e2d\") [:language])
  "
  ([^String collection id]
   (check-not-nil! id "id must not be nil")
   (find-one db coll {:_id id}))
  ([^DB db ^String coll id fields]
   (find-one collection {:_id id}))
  ([^String collection id fields]
   (check-not-nil! id "id must not be nil")
   (find-one db coll {:_id id} fields)))
   (find-one collection {:_id id} fields))
  ([^DB db ^String collection id fields]
   (check-not-nil! id "id must not be nil")
   (find-one db collection {:_id id} fields)))

(defn ^IPersistentMap find-map-by-id
  "Returns a single object, converted to map with matching _id field."
  ([^DB db ^String coll id]
  ([^String collection id]
   (check-not-nil! id "id must not be nil")
   (find-one-as-map db coll {:_id id}))
  ([^DB db ^String coll id fields]
   (from-db-object ^DBObject (find-one-as-map collection {:_id id}) true))
  ([^String collection id fields]
   (check-not-nil! id "id must not be nil")
   (find-one-as-map db coll {:_id id} fields))
  ([^DB db ^String coll id fields keywordize]
   (from-db-object ^DBObject (find-one-as-map collection {:_id id} fields) true))
  ([^String collection id fields keywordize]
   (check-not-nil! id "id must not be nil")
   (find-one-as-map db coll {:_id id} fields keywordize)))
   (from-db-object ^DBObject (find-one-as-map collection {:_id id} fields) keywordize)))


;;
;; monger.collection/group
;;


;; TBD


;;
;; monger.collection/count
;;

(defn count
  "Returns the number of documents in this collection.

   Takes optional conditions as an argument."
  (^long [^DB db ^String coll]
   (.count (.getCollection db (name coll))))
  (^long [^DB db ^String coll ^Map conditions]
   (.count (.getCollection db (name coll)) (to-db-object conditions))))
   Takes optional conditions as an argument.

       (monger.collection/count collection)

       (monger.collection/count collection {:first_name \"Paul\"})"
  (^long [^String collection]
   (.count (.getCollection monger.core/*mongodb-database* (name collection))))
  (^long [^String collection ^Map conditions]
   (.count (.getCollection monger.core/*mongodb-database* (name collection)) (to-db-object conditions)))
  (^long [^DB db ^String collection ^Map conditions]
   (.count (.getCollection db (name collection)) (to-db-object conditions))))

(defn any?
  "Whether the collection has any items at all, or items matching query."
  ([^DB db ^String coll]
   (> (count db coll) 0))
  ([^DB db ^String coll ^Map conditions]
   (> (count db coll conditions) 0)))
  "Whether the collection has any items at all, or items matching query.

   EXAMPLES:

       ;; whether the collection has any items
       (mgcol/any? collection)

       (mgcol/any? collection {:language \"Clojure\"})
  "
  ([^String collection]
   (> (count collection) 0))
  ([^String collection ^Map conditions]
   (> (count collection conditions) 0))
  ([^DB db ^String collection ^Map conditions]
   (> (count db collection conditions) 0)))


(defn empty?
  "Whether the collection is empty."
  [^DB db ^String coll]
  (= (count db coll {}) 0))
  "Whether the collection is empty.

   EXAMPLES:
       (mgcol/empty? \"things\")
  "
  ([^String collection]
   (= (count collection) 0))
  ([^DB db ^String collection]
   (= (count db collection {}) 0)))

;; monger.collection/update

(defn ^WriteResult update
  "Performs an update operation.

   Please note that update is a potentially destructive operation. It updates the document with the given set,
   emptying the fields not mentioned in the new document. In order to only change certain fields, use
   Please note that update is a potentially destructive operation. It will update your document with the given set,
   emptying the fields not mentioned in (^Map document). In order to only change certain fields, use
   \"$set\".

   You can use all the MongoDB modifier operations ($inc, $set, $unset, $push, $pushAll, $addToSet, $pop, $pull,
   $pullAll, $rename, $bit) here as well.

   EXAMPLES

       (monger.collection/update \"people\" {:first_name \"Raul\"} {\"$set\" {:first_name \"Paul\"}})

   You can use all the MongoDB modifier operations ($inc, $set, $unset, $push, $pushAll, $addToSet, $pop, $pull,
   $pullAll, $rename, $bit) here as well.

   EXAMPLES

       (monger.collection/update \"people\" {:first_name \"Paul\"} {\"$set\" {:index 1}})
       (monger.collection/update \"people\" {:first_name \"Paul\"} {\"$inc\" {:index 5}})

       (monger.collection/update \"people\" {:first_name \"Paul\"} {\"$unset\" {:years_on_stage 1}})

   It also takes modifiers, such as :upsert and :multi.

   EXAMPLES

       ;; add :band field to all the records found in \"people\" collection, otherwise only the first matched record
       ;; will be updated
       (monger.collection/update \"people\" {} {\"$set\" {:band \"The Beatles\"}} :multi true)

       ;; inserts the record if it did not exist in the collection
       (monger.collection/update \"people\" {:first_name \"Yoko\"} {:first_name \"Yoko\" :last_name \"Ono\"} :upsert true)

   It also takes options, such as :upsert and :multi.
   By default :upsert and :multi are false."
  ([^DB db ^String coll ^Map conditions ^Map document]
   (update db coll conditions document {}))
  ([^DB db ^String coll ^Map conditions ^Map document {:keys [upsert multi write-concern]
                                                       :or {upsert false
                                                            multi false
                                                            write-concern mc/*mongodb-write-concern*}}]
   (.update (.getCollection db (name coll))
  ([^String collection ^Map conditions ^Map document & {:keys [upsert multi write-concern] :or {upsert false
                                                                                                multi false
                                                                                                write-concern monger.core/*mongodb-write-concern*}}]
   (.update (.getCollection monger.core/*mongodb-database* (name collection))
            (to-db-object conditions)
            (to-db-object document)
            upsert
@@ -294,42 +405,21 @@
   sets :upsert to true.

   See monger.collection/update documentation"
  ([^DB db ^String coll ^Map conditions ^Map document]
   (upsert db coll conditions document {}))
  ([^DB db ^String coll ^Map conditions ^Map document {:keys [multi write-concern]
                                                       :or {multi false
                                                            write-concern mc/*mongodb-write-concern*}}]
   (update db coll conditions document {:multi multi :write-concern write-concern :upsert true})))
  [^String collection ^Map conditions ^Map document & {:keys [multi write-concern] :or {multi false
                                                                                        write-concern monger.core/*mongodb-write-concern*}}]
  (update collection conditions document :multi multi :write-concern write-concern :upsert true))

(defn ^WriteResult update-by-id
  "Update a document with given id"
  ([^DB db ^String coll id ^Map document]
   (update-by-id db coll id document {}))
  ([^DB db ^String coll id ^Map document {:keys [upsert write-concern]
                                          :or {upsert false
                                               write-concern mc/*mongodb-write-concern*}}]
   (check-not-nil! id "id must not be nil")
   (.update (.getCollection db (name coll))
            (to-db-object {:_id id})
            (to-db-object document)
            upsert
            false
            write-concern)))

(defn ^WriteResult update-by-ids
  "Update documents by given ids"
  ([^DB db ^String coll ids ^Map document]
   (update-by-ids db coll ids document {}))
  ([^DB db ^String coll ids ^Map document {:keys [upsert write-concern]
                                           :or {upsert false
                                                write-concern mc/*mongodb-write-concern*}}]
   (check-not-nil! (seq ids) "ids must not be nil or empty")
   (.update (.getCollection db (name coll))
            (to-db-object {:_id {"$in" ids}})
            (to-db-object document)
            upsert
            true
            write-concern)))
  [^String collection id ^Map document & {:keys [upsert write-concern] :or {upsert false
                                                                            write-concern monger.core/*mongodb-write-concern*}}]
  (check-not-nil! id "id must not be nil")
  (.update (.getCollection monger.core/*mongodb-database* (name collection))
           (to-db-object {:_id id})
           (to-db-object document)
           upsert
           false
           write-concern))


;; monger.collection/save
@@ -341,13 +431,22 @@
   If the object is already in the database, it will be updated.

   This function returns write result. If you want to get the exact persisted document back,
   use `save-and-return`."
  ([^DB db ^String coll ^Map document]
   (.save (.getCollection db (name coll))
   use `save-and-return`.

   EXAMPLES

       (monger.collection/save \"people\" {:first_name \"Ian\" :last_name \"Gillan\"})
  "
  ([^String collection ^Map document]
   (.save (.getCollection monger.core/*mongodb-database* (name collection))
          (to-db-object document)
          mc/*mongodb-write-concern*))
  ([^DB db ^String coll ^Map document ^WriteConcern write-concern]
   (.save (.getCollection db (name coll))
          monger.core/*mongodb-write-concern*))
  ([^String collection ^Map document ^WriteConcern write-concern]
   (.save (.getCollection monger.core/*mongodb-database* (name collection))
          (to-db-object document)
          write-concern))
  ([^DB db ^String collection ^Map document ^WriteConcern write-concern]
   (.save (.getCollection db (name collection))
          (to-db-object document)
          write-concern)))
@@ -360,51 +459,76 @@
   This function returns the exact persisted document back, including the `:_id` key in
   case of an insert.

   If you want to get write result back, use `save`."
  ([^DB db ^String coll ^Map document]
   (save-and-return db coll document ^WriteConcern mc/*mongodb-write-concern*))
  ([^DB db ^String coll ^Map document ^WriteConcern write-concern]
   If you want to get write result back, use `save`.

   EXAMPLES

       (monger.collection/save-and-return \"people\" {:first_name \"Ian\" :last_name \"Gillan\"})
  "
  ([^String collection ^Map document]
   (save-and-return ^DB monger.core/*mongodb-database* collection document ^WriteConcern monger.core/*mongodb-write-concern*))
  ([^String collection ^Map document ^WriteConcern write-concern]
   (save-and-return ^DB monger.core/*mongodb-database* collection document write-concern))
  ([^DB db ^String collection ^Map document ^WriteConcern write-concern]
   ;; see the comment in insert-and-return. Here we additionally need to make sure to not scrap the :_id key if
   ;; it is already present. MK.
   (let [doc (merge {:_id (ObjectId.)} document)]
     (save db coll doc write-concern)
     (save db collection doc write-concern)
     doc)))


;; monger.collection/remove

(defn ^WriteResult remove
  "Removes objects from the database."
  ([^DB db ^String coll]
   (.remove (.getCollection db (name coll)) (to-db-object {})))
  ([^DB db ^String coll ^Map conditions]
   (.remove (.getCollection db (name coll)) (to-db-object conditions))))
  "Removes objects from the database.

   EXAMPLES

       (monger.collection/remove collection) ;; Removes all documents from DB

       (monger.collection/remove collection {:language \"Clojure\"}) ;; Removes documents based on given query
  "
  ([^String collection]
   (.remove (.getCollection monger.core/*mongodb-database* (name collection)) (to-db-object {})))
  ([^String collection ^Map conditions]
   (.remove (.getCollection monger.core/*mongodb-database* (name collection)) (to-db-object conditions)))
  ([^DB db ^String collection ^Map conditions]
   (.remove (.getCollection db (name collection)) (to-db-object conditions))))
|
||||
|
||||
|
||||
(defn ^WriteResult remove-by-id
|
||||
"Removes a single document with given id"
|
||||
[^DB db ^String coll id]
|
||||
(check-not-nil! id "id must not be nil")
|
||||
(let [coll (.getCollection db (name coll))]
|
||||
(.remove coll (to-db-object {:_id id}))))
|
||||
([^String collection id]
|
||||
(remove-by-id monger.core/*mongodb-database* collection id))
|
||||
([^DB db ^String collection id]
|
||||
(check-not-nil! id "id must not be nil")
|
||||
(let [coll (.getCollection db (name collection))]
|
||||
(.remove coll (to-db-object {:_id id})))))
|
||||
|
||||
(defn purge-many
|
||||
"Purges (removes all documents from) multiple collections. Intended
|
||||
to be used in test environments."
|
||||
[^DB db xs]
|
||||
(doseq [coll xs]
|
||||
(remove db coll)))
|
||||
|
||||
;;
|
||||
;; monger.collection/create-index
|
||||
;;
|
||||
|
||||
(defn create-index
|
||||
"Forces creation of index on a set of fields, if one does not already exists."
|
||||
([^DB db ^String coll ^Map keys]
|
||||
(.createIndex (.getCollection db (name coll)) (as-field-selector keys)))
|
||||
([^DB db ^String coll ^Map keys ^Map options]
|
||||
(.createIndex (.getCollection db (name coll))
|
||||
"Forces creation of index on a set of fields, if one does not already exists.
|
||||
|
||||
EXAMPLES
|
||||
|
||||
;; Will create an index on the \"language\" field
|
||||
(monger.collection/create-index collection {\"language\" 1})
|
||||
(monger.collection/create-index collection {\"language\" 1} {:unique true :name \"unique_language\"})
|
||||
|
||||
"
|
||||
([^String collection ^Map keys]
|
||||
(.createIndex (.getCollection monger.core/*mongodb-database* (name collection)) (as-field-selector keys)))
|
||||
([^String collection ^Map keys options]
|
||||
(.createIndex (.getCollection monger.core/*mongodb-database* (name collection))
|
||||
(as-field-selector keys)
|
||||
(to-db-object options)))
|
||||
([^DB db ^String collection ^Map keys ^Map options]
|
||||
(.createIndex (.getCollection db (name collection))
|
||||
(as-field-selector keys)
|
||||
(to-db-object options))))
|
||||
|
||||
|
|
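The hunk above swaps the master branch's explicit-`DB` arities for 1.6.x-style arities that rely on the dynamic `monger.core/*mongodb-database*` var. A minimal sketch of the two calling conventions, assuming a locally running MongoDB, Monger on the classpath, and an illustrative database name:

```clojure
(require '[monger.core :as mg]
         '[monger.collection :as mc])

;; 1.6.x style: connect!/set-db! mutate dynamic vars, so collection
;; functions only need the collection name.
(mg/connect!)
(mg/set-db! (mg/get-db "monger-test"))
(mc/create-index "documents" (array-map "language" 1))

;; master (post-1.6.x) style: the DB value is passed explicitly:
;; (let [conn (mg/connect)
;;       db   (mg/get-db conn "monger-test")]
;;   (mc/create-index db "documents" (array-map "language" 1)))
```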
@@ -420,17 +544,26 @@
Options are:

:unique (boolean) to create a unique index
:name (string) to specify a custom index name and not rely on the generated one"
([^DB db ^String coll ^Map keys]
(.createIndex (.getCollection db (name coll)) (as-field-selector keys)))
([^DB db ^String coll ^Map keys ^Map options]
(.createIndex (.getCollection db (name coll))
:name (string) to specify a custom index name and not rely on the generated one

EXAMPLES

;; create a regular index
;; clojure.core/array-map produces an ordered map
(monger.collection/ensure-index \"documents\" (array-map \"language\" 1))
;; create a unique index
(monger.collection/ensure-index \"pages\" (array-map :url 1) {:unique true})
"
([^String collection ^Map keys]
(.ensureIndex (.getCollection monger.core/*mongodb-database* (name collection)) (as-field-selector keys)))
([^String collection ^Map keys ^Map options]
(.ensureIndex (.getCollection monger.core/*mongodb-database* (name collection))
(as-field-selector keys)
(to-db-object options)))
([^DB db ^String coll ^Map keys ^String index-name unique?]
(.createIndex (.getCollection db (name coll))
([^String collection ^Map keys ^String name ^Boolean unique?]
(.ensureIndex (.getCollection monger.core/*mongodb-database* (name collection))
(as-field-selector keys)
index-name
name
unique?)))


@@ -439,9 +572,15 @@
;;

(defn indexes-on
"Return a list of the indexes for this collection."
[^DB db ^String coll]
(from-db-object (.getIndexInfo (.getCollection db (name coll))) true))
"Return a list of the indexes for this collection.

EXAMPLES

(monger.collection/indexes-on collection)

"
[^String collection]
(from-db-object (.getIndexInfo (.getCollection monger.core/*mongodb-database* (name collection))) true))


;;
@@ -450,15 +589,17 @@

(defn drop-index
"Drops an index from this collection."
[^DB db ^String coll idx]
(if (string? idx)
(.dropIndex (.getCollection db (name coll)) ^String idx)
(.dropIndex (.getCollection db (name coll)) (to-db-object idx))))
([^String collection ^String idx-name]
(.dropIndex (.getCollection monger.core/*mongodb-database* (name collection)) idx-name))
([^DB db ^String collection ^String idx-name]
(.dropIndex (.getCollection db (name collection)) idx-name)))

(defn drop-indexes
"Drops all indexes from this collection."
[^DB db ^String coll]
(.dropIndexes (.getCollection db (name coll))))
([^String collection]
(.dropIndexes (.getCollection monger.core/*mongodb-database* (name collection))))
([^DB db ^String collection]
(.dropIndexes (.getCollection db (name collection)))))


;;
@@ -467,9 +608,16 @@


(defn exists?
"Checks whether collection with certain name exists."
([^DB db ^String coll]
(.collectionExists db coll)))
"Checks whether a collection with the given name exists.

EXAMPLE:

(monger.collection/exists? \"coll\")
"
([^String collection]
(.collectionExists monger.core/*mongodb-database* collection))
([^DB db ^String collection]
(.collectionExists db collection)))

(defn create
"Creates a collection with a given name and options.

@@ -478,21 +626,43 @@

:capped (pass true to create a capped collection)
:max (number of documents)
:size (max allowed size of the collection, in bytes)"
[^DB db ^String coll ^Map options]
(.createCollection db coll (to-db-object options)))
:size (max allowed size of the collection, in bytes)

EXAMPLE:

;; create a capped collection
(monger.collection/create \"coll\" {:capped true :size 100000 :max 10})
"
([^String collection ^Map options]
(.createCollection monger.core/*mongodb-database* collection (to-db-object options)))
([^DB db ^String collection ^Map options]
(.createCollection db collection (to-db-object options))))

(defn drop
"Deletes collection from database."
[^DB db ^String coll]
(.drop (.getCollection db (name coll))))
"Deletes collection from database.

EXAMPLE:

(monger.collection/drop \"collection-to-drop\")
"
([^String collection]
(.drop (.getCollection monger.core/*mongodb-database* (name collection))))
([^DB db ^String collection]
(.drop (.getCollection db (name collection)))))

(defn rename
"Renames collection."
([^DB db ^String from, ^String to]
(.rename (.getCollection db (name from)) (name to)))
([^DB db ^String from ^String to drop-target?]
(.rename (.getCollection db (name from)) (name to) drop-target?)))
"Renames collection.

EXAMPLE:

(monger.collection/rename \"old_name\" \"new_name\")
"
([^String from, ^String to]
(.rename (.getCollection monger.core/*mongodb-database* from) to))
([^String from ^String to ^Boolean drop-target]
(.rename (.getCollection monger.core/*mongodb-database* from) to drop-target))
([^DB db ^String from ^String to ^Boolean drop-target]
(.rename (.getCollection db from) to drop-target)))

;;
;; Map/Reduce

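Combining the collection-management functions from the hunk above, a capped-collection workflow in the 1.6.x implicit-database style might look like this (a sketch; collection names are illustrative and a default database is assumed to be set):

```clojure
;; 1.6.x implicit-database style
(monger.collection/create "events" {:capped true :size 100000 :max 10})
(monger.collection/exists? "events")
(monger.collection/rename "events" "events_archive")
(monger.collection/drop "events_archive")
```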
@@ -500,11 +670,11 @@

(defn map-reduce
"Performs a map reduce operation"
([^DB db ^String coll ^String js-mapper ^String js-reducer ^String output ^Map query]
(let [coll (.getCollection db (name coll))]
([^String collection ^String js-mapper ^String js-reducer ^String output ^Map query]
(let [coll (.getCollection monger.core/*mongodb-database* (name collection))]
(.mapReduce coll js-mapper js-reducer output (to-db-object query))))
([^DB db ^String coll ^String js-mapper ^String js-reducer ^String output ^MapReduceCommand$OutputType output-type ^Map query]
(let [coll (.getCollection db (name coll))]
([^String collection ^String js-mapper ^String js-reducer ^String output ^MapReduceCommand$OutputType output-type ^Map query]
(let [coll (.getCollection monger.core/*mongodb-database* (name collection))]
(.mapReduce coll js-mapper js-reducer output output-type (to-db-object query)))))


@@ -514,55 +684,29 @@

(defn distinct
"Finds distinct values for a key"
([^DB db ^String coll ^String key]
(.distinct (.getCollection db (name coll)) ^String (to-db-object key)))
([^DB db ^String coll ^String key ^Map query]
(.distinct (.getCollection db (name coll)) ^String (to-db-object key) (to-db-object query))))
([^String collection ^String key]
(.distinct (.getCollection monger.core/*mongodb-database* (name collection)) ^String (to-db-object key)))
([^String collection ^String key ^Map query]
(.distinct (.getCollection monger.core/*mongodb-database* (name collection)) ^String (to-db-object key) (to-db-object query)))
([^DB db ^String collection ^String key ^Map query]
(.distinct (.getCollection db (name collection)) ^String (to-db-object key) (to-db-object query))))

;;
;; Aggregation
;;

(defn- build-aggregation-options
^AggregationOptions
[{:keys [^Boolean allow-disk-use cursor ^Long max-time]}]
(cond-> (AggregationOptions/builder)
allow-disk-use (.allowDiskUse allow-disk-use)
cursor (.outputMode AggregationOptions$OutputMode/CURSOR)
max-time (.maxTime max-time TimeUnit/MILLISECONDS)
(:batch-size cursor) (.batchSize (int (:batch-size cursor)))
true .build))

(defn aggregate
"Executes an aggregation query. MongoDB 2.2+ only.
Accepts the options :allow-disk-use and :cursor (a map with the :batch-size
key), as described in the MongoDB manual. Additionally, the :max-time option
is supported, for specifying a limit on the execution time of the query in
milliseconds.

The :keywordize option controls whether resulting map keys will be turned into keywords; the default is true.
"Performs an aggregation query. MongoDB 2.1/2.2+ only.

See http://docs.mongodb.org/manual/applications/aggregation/ to learn more."
[^DB db ^String coll stages & opts]
(let [coll (.getCollection db (name coll))
agg-opts (build-aggregation-options opts)
pipe (into-array-list (to-db-object stages))
res (.aggregate coll pipe agg-opts)
{:keys [^Boolean keywordize]
:or {keywordize true}} opts]
(map #(from-db-object % keywordize) (iterator-seq res))))
[^String collection stages]
(let [res (monger.core/command {:aggregate collection :pipeline stages})]
;; this is what DBCollection#distinct does. Turning a blind eye!
(.throwOnError res)
(map #(from-db-object % true) (.get res "result"))))

(defn explain-aggregate
"Returns the explain plan for an aggregation query. MongoDB 2.2+ only.

See http://docs.mongodb.org/manual/applications/aggregation/ to learn more."
[^DB db ^String coll stages & opts]
(let [coll (.getCollection db (name coll))
agg-opts (build-aggregation-options opts)
pipe (into-array-list (to-db-object stages))
res (.explainAggregate coll pipe agg-opts)]
(from-db-object res true)))
;;
;; Misc
;;

|
|||
(defn system-collection?
|
||||
"Evaluates to true if the given collection name refers to a system collection. System collections
|
||||
are prefixed with system. or fs. (default GridFS collection prefix)"
|
||||
[^String coll]
|
||||
(re-find system-collection-pattern coll))
|
||||
[^String collection]
|
||||
(re-find system-collection-pattern collection))
|
||||
|
|
|
|||
|
|
@@ -1,51 +1,26 @@
;; This source code is dual-licensed under the Apache License, version
;; 2.0, and the Eclipse Public License, version 1.0.
;;
;; The APL v2.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team
;; Copyright (c) 2011-2012 Michael S. Klishin
;; Copyright (c) 2012 Toby Hede
;;
;; Licensed under the Apache License, Version 2.0 (the "License");
;; you may not use this file except in compliance with the License.
;; You may obtain a copy of the License at
;;
;; http://www.apache.org/licenses/LICENSE-2.0
;;
;; Unless required by applicable law or agreed to in writing, software
;; distributed under the License is distributed on an "AS IS" BASIS,
;; WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
;; See the License for the specific language governing permissions and
;; limitations under the License.
;; ----------------------------------------------------------------------------------
;;
;; The EPL v1.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team.
;; Copyright (c) 2012 Toby Hede
;; All rights reserved.
;;
;; This program and the accompanying materials are made available under the terms of
;; the Eclipse Public License Version 1.0,
;; which accompanies this distribution and is available at
;; http://www.eclipse.org/legal/epl-v10.html.
;; ----------------------------------------------------------------------------------
;; The use and distribution terms for this software are covered by the
;; Eclipse Public License 1.0 (http://opensource.org/licenses/eclipse-1.0.php)
;; which can be found in the file epl-v10.html at the root of this distribution.
;; By using this software in any fashion, you are agreeing to be bound by
;; the terms of this license.
;; You must not remove this notice, or any other, from this software.

(ns monger.command
"Provides convenience functions for performing most commonly used MongoDB commands.
For a lower-level API that gives maximum flexibility, see `monger.core/command`. To use
MongoDB 2.2 Aggregation Framework, see `monger.collection/aggregate`.
(ns ^{:doc "Provides convenience functions for performing most commonly used MongoDB commands.
For a lower-level API that gives maximum flexibility, see `monger.core/command`. To use
MongoDB 2.2 Aggregation Framework, see `monger.collection/aggregate`.

Related documentation guides:
Related documentation guides:

* http://clojuremongodb.info/articles/commands.html
* http://clojuremongodb.info/articles/aggregation.html
* http://clojuremongodb.info/articles/mapreduce.html"
(:require monger.core
[monger.conversion :refer :all])
(:import [com.mongodb MongoClient DB DBObject]))
* http://clojuremongodb.info/articles/commands.html
* http://clojuremongodb.info/articles/aggregation.html
* http://clojuremongodb.info/articles/mapreduce.html"}
monger.command
(:require monger.core)
(:use monger.conversion)
(:import com.mongodb.DB))


;;
@@ -54,55 +29,72 @@

(defn admin-command
"Executes a command on the admin database"
[^MongoClient conn m]
(monger.core/command (monger.core/admin-db conn) m))

(defn raw-admin-command
"Executes a command on the admin database"
[^MongoClient conn ^DBObject cmd]
(monger.core/raw-command (monger.core/admin-db conn) cmd))
[m]
(monger.core/command (monger.core/admin-db) m))

(defn collection-stats
[^DB database collection]
(monger.core/command database {:collstats collection}))
([collection]
(collection-stats monger.core/*mongodb-database* collection))
([^DB database collection]
(monger.core/command database {:collstats collection})))

(defn db-stats
[^DB database]
(monger.core/command database {:dbStats 1}))
([]
(db-stats monger.core/*mongodb-database*))
([^DB database]
(monger.core/command database {:dbStats 1})))


(defn reindex-collection
"Forces an existing collection to be reindexed using the reindexCollection command"
[^DB database ^String collection]
(monger.core/command database {:reIndex collection}))
([^String collection]
(reindex-collection monger.core/*mongodb-database* collection))
([^DB database ^String collection]
(monger.core/command database {:reIndex collection})))

(defn rename-collection
"Changes the name of an existing collection using the renameCollection command"
[^DB db ^String from ^String to]
(monger.core/command db (sorted-map :renameCollection from :to to)))
([^String from ^String to]
(rename-collection monger.core/*mongodb-database* from to))
([^DB database ^String from ^String to]
(monger.core/command database (sorted-map :renameCollection from :to to))))

(defn convert-to-capped
"Converts an existing, non-capped collection to a capped collection using the convertToCapped command"
[^DB db ^String collection ^long size]
(monger.core/command db (sorted-map :convertToCapped collection :size size)))
([^String collection ^long size]
(convert-to-capped monger.core/*mongodb-database* collection size))
([^DB database ^String collection ^long size]
(monger.core/command database (sorted-map :convertToCapped collection :size size))))

(defn empty-capped
"Removes all documents from a capped collection using the emptycapped command"
[^DB db ^String collection]
(monger.core/command db {:emptycapped collection}))
([^String collection]
(empty-capped monger.core/*mongodb-database* collection))
([^DB database ^String collection]
(monger.core/command database {:emptycapped collection})))


(defn compact
"Rewrites and defragments a single collection using the compact command. This also forces all indexes on the collection to be rebuilt"
[^DB db ^String collection]
(monger.core/command db {:compact collection}))
([^String collection]
(compact monger.core/*mongodb-database* collection))
([^DB database ^String collection]
(monger.core/command database {:compact collection})))


(defn server-status
[^DB db]
(monger.core/command db {:serverStatus 1}))
([]
(server-status monger.core/*mongodb-database*))
([^DB database]
(monger.core/command database {:serverStatus 1})))


(defn top
[^MongoClient conn]
(monger.core/command (monger.core/admin-db conn) {:top 1}))
[]
(monger.core/command (monger.core/get-db "admin") {:top 1}))

(defn search
([^String collection query]
(monger.core/command {"text" collection "search" query}))
([^DB database ^String collection query]
(monger.core/command database {"text" collection "search" query})))

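The command helpers in this hunk gain implicit-database arities in 1.6.x, delegating to `monger.core/*mongodb-database*`. A usage sketch (the database name is illustrative; assumes a local MongoDB):

```clojure
(require '[monger.core :as mg]
         '[monger.command :as cmd])

(mg/connect!)
(mg/set-db! (mg/get-db "monger-test"))

(cmd/collection-stats "documents") ;; stats for one collection, implicit DB
(cmd/db-stats)                     ;; stats for the current database
(cmd/server-status)                ;; serverStatus command
```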
@@ -1,36 +1,3 @@
;; This source code is dual-licensed under the Apache License, version
;; 2.0, and the Eclipse Public License, version 1.0.
;;
;; The APL v2.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team
;;
;; Licensed under the Apache License, Version 2.0 (the "License");
;; you may not use this file except in compliance with the License.
;; You may obtain a copy of the License at
;;
;; http://www.apache.org/licenses/LICENSE-2.0
;;
;; Unless required by applicable law or agreed to in writing, software
;; distributed under the License is distributed on an "AS IS" BASIS,
;; WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
;; See the License for the specific language governing permissions and
;; limitations under the License.
;; ----------------------------------------------------------------------------------
;;
;; The EPL v1.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team.
;; All rights reserved.
;;
;; This program and the accompanying materials are made available under the terms of
;; the Eclipse Public License Version 1.0,
;; which accompanies this distribution and is available at
;; http://www.eclipse.org/legal/epl-v10.html.
;; ----------------------------------------------------------------------------------

(ns monger.constraints)


@@ -1,53 +1,40 @@
;; This source code is dual-licensed under the Apache License, version
;; 2.0, and the Eclipse Public License, version 1.0.
;; Original author is Andrew Boekhoff
;;
;; The APL v2.0:
;;
;; ----------------------------------------------------------------------------------
;; Portions of the code are Copyright (c) 2009 Andrew Boekhoff
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team
;; Copyright (c) 2011-2012 Michael S. Klishin
;;
;; Licensed under the Apache License, Version 2.0 (the "License");
;; you may not use this file except in compliance with the License.
;; You may obtain a copy of the License at
;; Permission is hereby granted, free of charge, to any person obtaining a copy
;; of this software and associated documentation files (the "Software"), to deal
;; in the Software without restriction, including without limitation the rights
;; to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
;; copies of the Software, and to permit persons to whom the Software is
;; furnished to do so, subject to the following conditions:
;;
;; http://www.apache.org/licenses/LICENSE-2.0
;; The above copyright notice and this permission notice shall be included in
;; all copies or substantial portions of the Software.
;;
;; Unless required by applicable law or agreed to in writing, software
;; distributed under the License is distributed on an "AS IS" BASIS,
;; WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
;; See the License for the specific language governing permissions and
;; limitations under the License.
;; ----------------------------------------------------------------------------------
;;
;; The EPL v1.0:
;;
;; ----------------------------------------------------------------------------------
;; Portions of the code are Copyright (c) 2009 Andrew Boekhoff
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team.
;; All rights reserved.
;;
;; This program and the accompanying materials are made available under the terms of
;; the Eclipse Public License Version 1.0,
;; which accompanies this distribution and is available at
;; http://www.eclipse.org/legal/epl-v10.html.
;; ----------------------------------------------------------------------------------
;; THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
;; IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
;; FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
;; AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
;; LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
;; OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
;; THE SOFTWARE.

(ns monger.conversion
"Provides functions that convert between MongoDB Java driver classes (DBObject, DBList) and Clojure
data structures (maps, collections). Most of the time, application developers won't need to use these
functions directly because Monger Query DSL and many other functions convert documents to Clojure sequences and
maps automatically. However, this namespace is part of the public API and guaranteed to be stable between minor releases.
(ns ^{:doc "Provides functions that convert between MongoDB Java driver classes (DBObject, DBList) and Clojure
data structures (maps, collections). Most of the time, application developers won't need to use these
functions directly because Monger Query DSL and many other functions convert documents to Clojure sequences and
maps automatically. However, this namespace is part of the public API and guaranteed to be stable between minor releases.

Related documentation guides:
Related documentation guides:

* http://clojuremongodb.info/articles/inserting.html
* http://clojuremongodb.info/articles/querying.html"
* http://clojuremongodb.info/articles/inserting.html
* http://clojuremongodb.info/articles/querying.html"}
monger.conversion
(:import [com.mongodb DBObject BasicDBObject BasicDBList DBCursor]
[clojure.lang IPersistentMap Named Keyword Ratio]
[java.util List Map Date Set]
org.bson.types.ObjectId
(org.bson.types Decimal128)))
org.bson.types.ObjectId))

(defprotocol ConvertToDBObject
(^com.mongodb.DBObject to-db-object [input] "Converts given piece of Clojure data to BasicDBObject MongoDB Java driver uses"))

@@ -95,8 +82,8 @@
DBObject
(to-db-object [^DBObject input] input)

com.mongodb.DBRef
(to-db-object [^com.mongodb.DBRef dbref]
com.novemberain.monger.DBRef
(to-db-object [^com.novemberain.monger.DBRef dbref]
dbref)

Object

@@ -105,44 +92,55 @@


(declare associate-pairs)
(defprotocol ConvertFromDBObject
(from-db-object [input keywordize] "Converts given DBObject instance to a piece of Clojure data"))

(extend-protocol ConvertFromDBObject
nil
(from-db-object [_ _] nil)
(from-db-object [input keywordize] input)

Object
(from-db-object [input _] input)
(from-db-object [input keywordize] input)

Decimal128
(from-db-object [^Decimal128 input _]
(.bigDecimalValue input))
Map
(from-db-object [^Map input keywordize]
(associate-pairs (.entrySet input) keywordize))

List
(from-db-object [^List input keywordize]
(mapv #(from-db-object % keywordize) input))
(vec (map #(from-db-object % keywordize) input)))

BasicDBList
(from-db-object [^BasicDBList input keywordize]
(mapv #(from-db-object % keywordize) input))
(vec (map #(from-db-object % keywordize) input)))

com.mongodb.DBRef
(from-db-object [^com.mongodb.DBRef input _]
input)
(from-db-object [^com.mongodb.DBRef input keywordize]
(com.novemberain.monger.DBRef. input))

DBObject
(from-db-object [^DBObject input keywordize]
;; DBObject provides .toMap, but the implementation in
;; subclass GridFSFile unhelpfully throws
;; UnsupportedOperationException.
(persistent!
(reduce (if keywordize
(fn [m ^String k]
(assoc! m (keyword k) (from-db-object (.get input k) true)))
(fn [m ^String k]
(assoc! m k (from-db-object (.get input k) false))))
(transient {}) (.keySet input)))))
;; UnsupportedOperationException. This part is taken from congomongo and
;; may need revisiting at a later point. MK.
(associate-pairs (for [key-set (.keySet input)] [key-set (.get input key-set)])
keywordize)))


(defn- associate-pairs [pairs keywordize]
;; Taking the keywordize test out of the fn reduces derefs
;; dramatically, which was the main barrier to matching pure-Java
;; performance for this marshalling. Taken from congomongo. MK.
(reduce (if keywordize
(fn [m [^String k v]]
(assoc m (keyword k) (from-db-object v true)))
(fn [m [^String k v]]
(assoc m k (from-db-object v false))))
{} (reverse pairs)))


(defprotocol ConvertToObjectId

@@ -1,53 +1,29 @@
;; This source code is dual-licensed under the Apache License, version
;; 2.0, and the Eclipse Public License, version 1.0.
;; Copyright (c) 2011-2012 Michael S. Klishin
;;
;; The APL v2.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team
;;
;; Licensed under the Apache License, Version 2.0 (the "License");
;; you may not use this file except in compliance with the License.
;; You may obtain a copy of the License at
;;
;; http://www.apache.org/licenses/LICENSE-2.0
;;
;; Unless required by applicable law or agreed to in writing, software
;; distributed under the License is distributed on an "AS IS" BASIS,
;; WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
;; See the License for the specific language governing permissions and
;; limitations under the License.
;; ----------------------------------------------------------------------------------
;;
;; The EPL v1.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team.
;; All rights reserved.
;;
;; This program and the accompanying materials are made available under the terms of
;; the Eclipse Public License Version 1.0,
;; which accompanies this distribution and is available at
;; http://www.eclipse.org/legal/epl-v10.html.
;; ----------------------------------------------------------------------------------
;; The use and distribution terms for this software are covered by the
;; Eclipse Public License 1.0 (http://opensource.org/licenses/eclipse-1.0.php)
;; which can be found in the file epl-v10.html at the root of this distribution.
;; By using this software in any fashion, you are agreeing to be bound by
;; the terms of this license.
;; You must not remove this notice, or any other, from this software.

(ns monger.core
  "Thin idiomatic wrapper around MongoDB Java client. monger.core includes
   fundamental functions that perform database/replica set connection, set default write concern, default database, performing commands
   and so on. Most of the functionality is in other monger.* namespaces, in particular monger.collection, monger.query and monger.gridfs
(ns ^{:author "Michael S. Klishin"
      :doc "Thin idiomatic wrapper around MongoDB Java client. monger.core includes
   fundamental functions that perform database/replica set connection, set default write concern, default database, performing commands
   and so on. Most of the functionality is in other monger.* namespaces, in particular monger.collection, monger.query and monger.gridfs

   Related documentation guides:
   Related documentation guides:

   * http://clojuremongodb.info/articles/connecting.html
   * http://clojuremongodb.info/articles/commands.html
   * http://clojuremongodb.info/articles/gridfs.html"
   * http://clojuremongodb.info/articles/connecting.html
   * http://clojuremongodb.info/articles/commands.html
   * http://clojuremongodb.info/articles/gridfs.html"}
  monger.core
  (:refer-clojure :exclude [count])
  (:require [monger.conversion :refer :all]
            [monger.util :refer [into-array-list]])
  (:import [com.mongodb MongoClient MongoClientURI MongoCredential DB WriteConcern DBObject DBCursor Bytes
            MongoClientOptions MongoClientOptions$Builder ServerAddress MapReduceOutput MongoException]
  (:use monger.conversion
        [monger.result :only [ok?]])
  (:import [com.mongodb MongoClient MongoClientURI DB WriteConcern DBObject DBCursor Bytes MongoClientOptions MongoClientOptions$Builder ServerAddress MapReduceOutput MongoException]
           [com.mongodb.gridfs GridFS]
           [java.util Map]))
           [java.util Map ArrayList]))

;;
;; Defaults
@@ -56,78 +32,103 @@
(def ^:dynamic ^String *mongodb-host* "127.0.0.1")
(def ^:dynamic ^long *mongodb-port* 27017)

(def ^:dynamic ^WriteConcern *mongodb-write-concern* WriteConcern/ACKNOWLEDGED)
(declare ^:dynamic ^MongoClient *mongodb-connection*)
(declare ^:dynamic ^DB *mongodb-database*)
(def ^:dynamic ^WriteConcern *mongodb-write-concern* WriteConcern/SAFE)

(declare ^:dynamic ^GridFS *mongodb-gridfs*)


;;
;; API
;;

(defn ^MongoClient connect
(defn ^com.mongodb.MongoClient connect
  "Connects to MongoDB. When used without arguments, connects to

   Arguments:
     :host (\"127.0.0.1\" by default)
     :port (27017 by default)"
     :host (*mongodb-host* by default)
     :port (*mongodb-port* by default)

   EXAMPLES

       (monger.core/connect)
       (monger.core/connect { :host \"db3.intranet.local\", :port 27787 })

       ;; Connecting to a replica set with a couple of seeds
       (let [^MongoClientOptions opts (mg/mongo-options :threads-allowed-to-block-for-connection-multiplier 300)
             seeds [[\"192.168.1.1\" 27017] [\"192.168.1.2\" 27017] [\"192.168.1.1\" 27018]]
             sas (map #(apply mg/server-address %) seeds)]
         (mg/connect! sas opts))
  "
  {:arglists '([]
               [server-address options]
               [server-address options credentials]
               [[server-address & more] options]
               [{:keys [host port uri] :or { host *mongodb-host* port *mongodb-port*}}])}
               [server-address options]
               [[server-address & more] options]
               [{ :keys [host port uri] :or { host *mongodb-host* port *mongodb-port* }}])}
  ([]
     (MongoClient.))
  ([server-address ^MongoClientOptions options]
     (if (coll? server-address)
       ;; connect to a replica set
       (let [server-list (into-array-list server-address)]
       (let [server-list ^ArrayList (ArrayList. ^java.util.Collection server-address)]
         (MongoClient. server-list options))
       ;; connect to a single instance
       (MongoClient. ^ServerAddress server-address options)))
  ([server-address ^MongoClientOptions options credentials]
     (let [creds (into-array-list (if (coll? credentials)
                                    credentials
                                    [credentials]))]
       (if (coll? server-address)
         (let [server-list (into-array-list server-address)]
           (MongoClient. server-list ^java.util.List creds options))
         (MongoClient. ^ServerAddress server-address ^java.util.List creds options))))
  ([{ :keys [host port uri] :or { host *mongodb-host* port *mongodb-port* }}]
     (if uri
       (MongoClient. (MongoClientURI. uri))
       (MongoClient. ^String host ^Long port))))
     (MongoClient. ^String host ^Long port)))

(defn ^MongoClient connect-with-credentials
  "Connect with provided credentials and default options"
  ([credentials]
     (connect-with-credentials *mongodb-host* *mongodb-port* credentials))
  ([^String hostname credentials]
     (connect-with-credentials hostname *mongodb-port* credentials))
  ([^String hostname ^long port credentials]
     (MongoClient. (into-array-list [(ServerAddress. hostname port)])
                   (into-array-list (if (coll? credentials)
                                      credentials
                                      [credentials])))))

(defn get-db-names
  "Gets a list of all database names present on the server"
  [^MongoClient conn]
  (set (.getDatabaseNames conn)))
  ([]
     (get-db-names *mongodb-connection*))
  ([^MongoClient connection]
     (set (.getDatabaseNames connection))))


(defn ^DB get-db
  "Get database reference by name."
  [^MongoClient conn ^String name]
  (.getDB conn name))
(defn ^com.mongodb.DB get-db
  "Get database reference by name.

   EXAMPLES

       (monger.core/get-db \"myapp_production\")
       (monger.core/get-db connection \"myapp_production\")"
  ([]
     *mongodb-database*)
  ([^String name]
     (.getDB *mongodb-connection* name))
  ([^MongoClient connection ^String name]
     (.getDB connection name)))

(defn ^com.mongodb.DB current-db
  "Returns currently used database"
  []
  *mongodb-database*)

(defn drop-db
  "Drops a database"
  [^MongoClient conn ^String db]
  (.dropDatabase conn db))
  ([^String db]
     (.dropDatabase *mongodb-connection* db))
  ([^MongoClient conn ^String db]
     (.dropDatabase conn db)))


(defmacro with-connection
  [conn & body]
  `(binding [*mongodb-connection* ~conn]
     (do ~@body)))


(defmacro with-db
  [db & body]
  `(binding [*mongodb-database* ~db]
     (do ~@body)))

(defmacro with-gridfs
  [fs & body]
  `(binding [*mongodb-gridfs* ~fs]
     (do ~@body)))

(defn ^GridFS get-gridfs
  "Get GridFS for the given database."
  [^MongoClient conn ^String name]
  (GridFS. (.getDB conn name)))

(defn server-address
  ([^String hostname]
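The newer side of this diff threads an explicit `MongoClient` through `get-db` and friends instead of relying on the dynamic `*mongodb-connection*` var. A minimal sketch of that explicit style (host, port, and database name are illustrative; a reachable MongoDB server is assumed):

```clojure
;; Explicit-connection style shown in the functions above.
(require '[monger.core :as mg])

(let [conn (mg/connect {:host "127.0.0.1" :port 27017})
      db   (mg/get-db conn "monger_test")]
  ;; list databases visible to this connection, then clean up
  (println (mg/get-db-names conn))
  (mg/disconnect! conn))
```

The older 1.6.x side would instead call `(mg/connect!)` once and let `(mg/get-db "monger_test")` pick up the default connection.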
@@ -135,165 +136,125 @@
  ([^String hostname ^Long port]
     (ServerAddress. hostname port)))

(defn ^MongoClientOptions$Builder mongo-options-builder
  [{:keys [add-cluster-listener add-cluster-listeners add-command-listener add-command-listeners
           add-connection-pool-listener add-connection-pool-listeners add-server-listener add-server-listeners
           add-server-monitor-listener add-server-monitor-listeners always-use-mbeans application-name
           codec-registry compressor-list connect-timeout connections-per-host cursor-finalizer-enabled
           db-decoder-factory db-encoder-factory description heartbeat-connect-timeout heartbeat-frequency
           heartbeat-socket-timeout local-threshold max-connection-idle-time max-connection-life-time
           max-wait-time min-connections-per-host min-heartbeat-frequency read-concern read-preference
           required-replica-set-name retry-writes server-selection-timeout server-selector socket-keep-alive
           socket-factory socket-timeout ssl-context ssl-enabled ssl-invalid-host-name-allowed
           threads-allowed-to-block-for-connection-multiplier uuid-representation write-concern]}]

(defn mongo-options
  [& { :keys [connections-per-host threads-allowed-to-block-for-connection-multiplier
              max-wait-time connect-timeout socket-timeout socket-keep-alive auto-connect-retry max-auto-connect-retry-time
              description write-concern cursor-finalizer-enabled] :or [auto-connect-retry true] }]
  (let [mob (MongoClientOptions$Builder.)]
    (when add-cluster-listener
      (.addClusterListener mob add-cluster-listener))
    (when add-cluster-listeners
      (doseq [cluster-listener add-cluster-listeners]
        (.addClusterListener mob cluster-listener)))
    (when add-command-listener
      (.addCommandListener mob add-command-listener))
    (when add-command-listeners
      (doseq [command-listener add-command-listeners]
        (.addCommandListener mob command-listener)))
    (when add-connection-pool-listener
      (.addConnectionPoolListener mob add-connection-pool-listener))
    (when add-connection-pool-listeners
      (doseq [connection-pool-listener add-connection-pool-listeners]
        (.addConnectionPoolListener mob connection-pool-listener)))
    (when add-server-listener
      (.addServerListener mob add-server-listener))
    (when add-server-listeners
      (doseq [server-listener add-server-listeners]
        (.addServerListener mob server-listener)))
    (when add-server-monitor-listener
      (.addServerMonitorListener mob add-server-monitor-listener))
    (when add-server-monitor-listeners
      (doseq [server-monitor-listener add-server-monitor-listeners]
        (.addServerMonitorListener mob server-monitor-listener)))
    (when always-use-mbeans
      (.alwaysUseMBeans mob always-use-mbeans))
    (when application-name
      (.applicationName mob application-name))
    (when always-use-mbeans
      (.alwaysUseMBeans mob always-use-mbeans))
    (when codec-registry
      (.codecRegistry mob codec-registry))
    (when compressor-list
      (.compressorList mob compressor-list))
    (when connections-per-host
      (.connectionsPerHost mob connections-per-host))
    (when connect-timeout
      (.connectTimeout mob connect-timeout))
    (when cursor-finalizer-enabled
      (.cursorFinalizerEnabled mob cursor-finalizer-enabled))
    (when db-decoder-factory
      (.dbDecoderFactory mob db-decoder-factory))
    (when db-encoder-factory
      (.dbEncoderFactory mob db-encoder-factory))
    (when description
      (.description mob description))
    (when heartbeat-connect-timeout
      (.heartbeatConnectTimeout mob heartbeat-connect-timeout))
    (when heartbeat-frequency
      (.heartbeatFrequency mob heartbeat-frequency))
    (when heartbeat-socket-timeout
      (.heartbeatSocketTimeout mob heartbeat-socket-timeout))
    (when ssl-context
      (.sslContext mob ssl-context))
    (when local-threshold
      (.localThreshold mob local-threshold))
    (when max-connection-idle-time
      (.maxConnectionIdleTime mob max-connection-idle-time))
    (when max-wait-time
      (.maxWaitTime mob max-wait-time))
    (when max-connection-life-time
      (.maxConnectionLifeTime mob max-connection-life-time))
    (when min-connections-per-host
      (.minConnectionsPerHost mob min-connections-per-host))
    (when min-heartbeat-frequency
      (.minHeartbeatFrequency mob min-heartbeat-frequency))
    (when read-concern
      (.readConcern mob read-concern))
    (when read-preference
      (.readPreference mob read-preference))
    (when required-replica-set-name
      (.requiredReplicaSetName mob required-replica-set-name))
    (when retry-writes
      (.retryWrites mob retry-writes))
    (when server-selection-timeout
      (.serverSelectionTimeout mob server-selection-timeout))
    (when server-selector
      (.serverSelector mob server-selector))
    (when socket-keep-alive
      (.socketKeepAlive mob socket-keep-alive))
    (when socket-factory
      (.socketFactory mob socket-factory))
    (when socket-timeout
      (.socketTimeout mob socket-timeout))
    (when ssl-enabled
      (.sslEnabled mob ssl-enabled))
    (when ssl-invalid-host-name-allowed
      (.sslInvalidHostNameAllowed mob ssl-invalid-host-name-allowed))
    (when threads-allowed-to-block-for-connection-multiplier
      (.threadsAllowedToBlockForConnectionMultiplier mob threads-allowed-to-block-for-connection-multiplier))
    (when uuid-representation
      (.uuidRepresentation mob uuid-representation))
    (when max-wait-time
      (.maxWaitTime mob max-wait-time))
    (when connect-timeout
      (.connectTimeout mob connect-timeout))
    (when socket-timeout
      (.socketTimeout mob socket-timeout))
    (when socket-keep-alive
      (.socketKeepAlive mob socket-keep-alive))
    (when auto-connect-retry
      (.autoConnectRetry mob auto-connect-retry))
    (when max-auto-connect-retry-time
      (.maxAutoConnectRetryTime mob max-auto-connect-retry-time))
    (when description
      (.description mob description))
    (when write-concern
      (.writeConcern mob write-concern))
    mob))

(defn ^MongoClientOptions mongo-options
  [opts]
  (let [mob (mongo-options-builder opts)]
    (when cursor-finalizer-enabled
      (.cursorFinalizerEnabled mob cursor-finalizer-enabled))
    (.build mob)))

(defn disconnect

(defn set-connection!
  "Sets given MongoDB connection as default by altering *mongodb-connection* var"
  ^MongoClient [^MongoClient conn]
  (alter-var-root (var *mongodb-connection*) (constantly conn)))

(defn connect!
  "Connect to MongoDB, store connection in the *mongodb-connection* var"
  ^MongoClient [& args]
  (let [c (apply connect args)]
    (set-connection! c)))

(defn disconnect!
  "Closes default connection to MongoDB"
  [^MongoClient conn]
  (.close conn))
  []
  (.close *mongodb-connection*))

(defn set-db!
  "Sets *mongodb-database* var to given db, updates *mongodb-gridfs* var state. Recommended to be used for
   applications that only use one database."
  [db]
  (alter-var-root (var *mongodb-database*) (constantly db))
  (alter-var-root (var *mongodb-gridfs*) (constantly (GridFS. db))))


(def ^{:doc "Combines set-db! and get-db, so (use-db \"mydb\") is the same as (set-db! (get-db \"mydb\"))"}
  use-db! (comp set-db! get-db))

(def ^:const admin-db-name "admin")

(defn ^DB admin-db
  "Returns admin database"
  [^MongoClient conn]
  (get-db conn admin-db-name))
  []
  (get-db admin-db-name))


(defn set-default-write-concern!
  [wc]
  "Sets *mongodb-write-concern*"
  "Set *mongodb-write-concern* var to :wc

   Unlike the official Java driver, Monger uses WriteConcern/SAFE by default. We think defaults should be safe first
   and WebScale fast second."
  (alter-var-root #'*mongodb-write-concern* (constantly wc)))


(defn connect-via-uri
  "Connects to MongoDB using a URI, returns the connection and database as a map with :conn and :db.
   Commonly used for PaaS-based applications, for example, running on Heroku.
   If username and password are provided, performs authentication."
(defn authenticate
  ([^String username ^chars password]
     (authenticate *mongodb-connection* *mongodb-database* username password))
  ([^DB db ^String username ^chars password]
     (authenticate *mongodb-connection* db username password))
  ([^MongoClient connection ^DB db ^String username ^chars password]
     (try
       (.authenticate db username password)
       ;; MongoDB Java driver's exception hierarchy is a little crazy
       ;; and the exception we want is not a public class. MK.
       (catch Exception _
         false))))

(defn connect-via-uri!
  "Connects to MongoDB using a URI, sets up default connection and database. Commonly used for PaaS-based applications,
   for example, running on Heroku. If username and password are provided, performs authentication."
  [^String uri-string]
  (let [uri  (MongoClientURI. uri-string)
        conn (MongoClient. uri)]
    (if-let [dbName (.getDatabase uri)]
      {:conn conn :db (.getDB conn dbName)}
      (throw (IllegalArgumentException. "No database name specified in URI. Monger requires a database to be explicitly configured.")))))
  (let [uri  (MongoClientURI. uri-string)
        conn (MongoClient. uri)
        db   (.getDB conn (.getDatabase uri))
        user (.getUsername uri)
        pwd  (.getPassword uri)]
    (when (and user pwd)
      (when-not (authenticate conn db user pwd)
        (throw (IllegalArgumentException. (format "Could not authenticate with MongoDB. Either database name or credentials are invalid. Database name: %s, username: %s" (.getName db) user)))))
    ;; only do this *after* we authenticated because set-db! will try to set up a default GridFS instance. MK.
    (set-connection! conn)
    (when db
      (set-db! db))
    conn))


(defn ^com.mongodb.CommandResult command
  "Runs a database command (please check MongoDB documentation for the complete list of commands).

   Ordering of keys in the command document may matter. Please use sorted maps instead of map literals, for example:
       (array-map :near 50 :test 430 :num 10)
       (sorted-map geoNear \"bars\" :near 50 :test 430 :num 10)

   For commonly used commands (distinct, count, map/reduce, etc), use monger.command and monger.collection functions such as
   /distinct, /count, /drop, /dropIndexes, and /mapReduce respectively."
  [^DB database ^Map cmd]
  (.command ^DB database ^DBObject (to-db-object cmd)))

(defn ^com.mongodb.CommandResult raw-command
  "Like monger.core/command but accepts DBObjects"
  [^DB database ^DBObject cmd]
  (.command database cmd))
  ([^Map cmd]
     (.command ^DB *mongodb-database* ^DBObject (to-db-object cmd)))
  ([^DB database ^Map cmd]
     (.command ^DB database ^DBObject (to-db-object cmd))))

(defprotocol Countable
  (count [this] "Returns size of the object"))
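The newer `connect-via-uri` returns a `{:conn ... :db ...}` map and requires the database name in the URI, whereas the 1.6.x `connect-via-uri!` mutated the dynamic vars. A sketch of the newer style (the URI is illustrative and a reachable server is assumed):

```clojure
;; connect-via-uri returns both the connection and the database,
;; per the docstring above. Credentials in the URI trigger authentication.
(require '[monger.core :as mg])

(let [{:keys [conn db]} (mg/connect-via-uri "mongodb://127.0.0.1/monger_test")]
  ;; a cheap round-trip to confirm the connection works
  (mg/command db (sorted-map :ping 1))
  (mg/disconnect! conn))
```

Note the use of a sorted map for the command document, as the `command` docstring above recommends when key order matters.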
@@ -308,3 +269,23 @@
    ;; MongoDB Java driver could use a lot more specific type than Iterable but
    ;; it always uses DBCollection#find to populate the result set. MK.
    (.count ^DBCursor (.results this))))

(defn ^DBObject get-last-error
  "Returns the error (if there is one) from the previous operation on this connection.

   The result of this command looks like:

       #<CommandResult { \"serverUsed\" : \"127.0.0.1:27017\" , \"n\" : 0 , \"connectionId\" : 66 , \"err\" : null , \"ok\" : 1.0}>\"

   The value for err will be null if no error occurred, or a description otherwise.

   Important note: when calling this method directly, it is undefined which connection \"getLastError\" is called on.
   You may need to explicitly use a \"consistent request\", see requestStart(). For most purposes it is better not to call this method directly but instead use WriteConcern."
  ([]
     (get-last-error *mongodb-database*))
  ([^DB database]
     (.getLastError ^DB database))
  ([^DB database ^Integer w ^Integer wtimeout ^Boolean fsync]
     (.getLastError ^DB database w wtimeout fsync))
  ([^DB database ^WriteConcern write-concern]
     (.getLastError ^DB database write-concern)))
@@ -1,56 +0,0 @@
;; This source code is dual-licensed under the Apache License, version
;; 2.0, and the Eclipse Public License, version 1.0.
;;
;; The APL v2.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team
;;
;; Licensed under the Apache License, Version 2.0 (the "License");
;; you may not use this file except in compliance with the License.
;; You may obtain a copy of the License at
;;
;; http://www.apache.org/licenses/LICENSE-2.0
;;
;; Unless required by applicable law or agreed to in writing, software
;; distributed under the License is distributed on an "AS IS" BASIS,
;; WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
;; See the License for the specific language governing permissions and
;; limitations under the License.
;; ----------------------------------------------------------------------------------
;;
;; The EPL v1.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team.
;; All rights reserved.
;;
;; This program and the accompanying materials are made available under the terms of
;; the Eclipse Public License Version 1.0,
;; which accompanies this distribution and is available at
;; http://www.eclipse.org/legal/epl-v10.html.
;; ----------------------------------------------------------------------------------

(ns monger.credentials
  "Helper functions for instantiating various types
   of credentials."
  (:require [clojurewerkz.support.chars :refer :all])
  (:import [com.mongodb MongoCredential]))

;;
;; API
;;

(defn ^MongoCredential create
  "Creates a MongoCredential instance with an unspecified mechanism.
   The client will negotiate the best mechanism based on the
   version of the server that the client is authenticating to."
  [^String username ^String database pwd]
  (MongoCredential/createCredential username database (to-char-array pwd)))

(defn ^MongoCredential x509
  "Creates a MongoCredential instance for the X509-based authentication
   protocol."
  [^String username]
  (MongoCredential/createMongoX509Credential username))
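`monger.credentials` pairs with the credential-accepting `connect` arities shown earlier. A minimal sketch (username, database, and password are illustrative):

```clojure
;; Build a credential and connect with it, per the helpers above.
(require '[monger.credentials :as mcr]
         '[monger.core :as mg])

(let [creds (mcr/create "joe" "monger_test" "secret")
      conn  (mg/connect-with-credentials "127.0.0.1" creds)]
  (mg/get-db conn "monger_test"))
```

With an unspecified mechanism, the driver negotiates the strongest authentication mechanism the server supports.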
@@ -1,143 +0,0 @@
;; This source code is dual-licensed under the Apache License, version
;; 2.0, and the Eclipse Public License, version 1.0.
;;
;; The APL v2.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team
;;
;; Licensed under the Apache License, Version 2.0 (the "License");
;; you may not use this file except in compliance with the License.
;; You may obtain a copy of the License at
;;
;; http://www.apache.org/licenses/LICENSE-2.0
;;
;; Unless required by applicable law or agreed to in writing, software
;; distributed under the License is distributed on an "AS IS" BASIS,
;; WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
;; See the License for the specific language governing permissions and
;; limitations under the License.
;; ----------------------------------------------------------------------------------
;;
;; The EPL v1.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team.
;; All rights reserved.
;;
;; This program and the accompanying materials are made available under the terms of
;; the Eclipse Public License Version 1.0,
;; which accompanies this distribution and is available at
;; http://www.eclipse.org/legal/epl-v10.html.
;; ----------------------------------------------------------------------------------

(ns monger.cursor
  "Helper-functions for dbCursor object:
   * to initialize new cursor,
   * for CRUD functionality of options of dbCursor"
  (:import [com.mongodb DB DBCursor Bytes]
           [java.util List Map]
           [java.lang Integer]
           [clojure.lang Keyword])
  (:require [monger.core]
            [monger.conversion :refer [to-db-object from-db-object as-field-selector]]))

(defn ^DBCursor make-db-cursor
  "Initializes a new db-cursor."
  ([^DB db ^String coll]
     (make-db-cursor db coll {} {}))
  ([^DB db ^String coll ^Map ref]
     (make-db-cursor db coll ref {}))
  ([^DB db ^String coll ^Map ref fields]
     (.find
      (.getCollection db (name coll))
      (to-db-object ref)
      (as-field-selector fields))))

(def cursor-options {:awaitdata   Bytes/QUERYOPTION_AWAITDATA
                     ;;:exhaust Bytes/QUERYOPTION_EXHAUST - not human settable
                     :notimeout   Bytes/QUERYOPTION_NOTIMEOUT
                     :oplogreplay Bytes/QUERYOPTION_OPLOGREPLAY
                     :partial     Bytes/QUERYOPTION_PARTIAL
                     :slaveok     Bytes/QUERYOPTION_SLAVEOK
                     :tailable    Bytes/QUERYOPTION_TAILABLE})

(defn get-options
  "Returns map of cursor's options with current state."
  [^DBCursor db-cur]
  (into {}
        (for [[opt option-mask] cursor-options]
          [opt (< 0 (bit-and (.getOptions db-cur) option-mask))])))

(defn add-option!
  [^DBCursor db-cur ^String opt]
  (.addOption db-cur (get cursor-options (keyword opt) 0)))

(defn remove-option!
  [^DBCursor db-cur ^String opt]
  (.setOptions db-cur (bit-and-not (.getOptions db-cur)
                                   (get cursor-options (keyword opt) 0))))

(defmulti add-options (fn [db-cur opts] (class opts)))
(defmethod add-options Map [^DBCursor db-cur options]
  "Changes options using a map of settings, where each key names a setting
   and its boolean value gives the new state of that setting.
   usage:
       (add-options db-cur {:notimeout true, :tailable false})
   returns:
       ^DBCursor object."
  (doseq [[opt value] (seq options)]
    (if (= true value)
      (add-option! db-cur opt)
      (remove-option! db-cur opt)))
  db-cur)

(defmethod add-options List [^DBCursor db-cur options]
  "Takes a list of options and activates them.
   usage:
       (add-options db-cur [:notimeout :tailable])
   returns:
       ^DBCursor object"
  (doseq [opt (seq options)]
    (add-option! db-cur opt))
  db-cur)

(defmethod add-options Integer [^DBCursor db-cur, option]
  "Takes a com.mongodb.Bytes option value and adds it to the current settings.
   usage:
       (add-options db-cur com.mongodb.Bytes/QUERYOPTION_NOTIMEOUT)
   returns:
       ^DBCursor object"
  (.addOption db-cur option)
  db-cur)

(defmethod add-options Keyword [^DBCursor db-cur, option]
  "Takes a single keyword naming a setting and applies it to the db-cursor.
   usage:
       (add-options db-cur :notimeout)
   returns:
       ^DBCursor object"
  (add-option! db-cur option)
  db-cur)

(defmethod add-options :default [^DBCursor db-cur, options]
  "Using add-options with an unsupported option type returns the cursor unchanged"
  db-cur)

(defn ^DBCursor reset-options
  "Resets cursor options to default value and returns cursor"
  [^DBCursor db-cur]
  (.resetOptions db-cur)
  db-cur)

(defmulti format-as (fn [db-cur as] as))

(defmethod format-as :map [db-cur as]
  (map #(from-db-object %1 true) db-cur))

(defmethod format-as :seq [db-cur as]
  (seq db-cur))

(defmethod format-as :default [db-cur as]
  db-cur)
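Putting the cursor helpers together, a sketch of building a cursor, toggling a wire-protocol option, and realizing the results as maps (collection and database names are illustrative; a running server is assumed):

```clojure
;; Sketch of the monger.cursor API shown above.
(require '[monger.core :as mg]
         '[monger.cursor :as cur])

(let [conn   (mg/connect)
      db     (mg/get-db conn "monger_test")
      db-cur (cur/make-db-cursor db "documents")]
  ;; set QUERYOPTION_NOTIMEOUT via the keyword->bitmask table
  (cur/add-options db-cur {:notimeout true})
  ;; lazily convert each DBObject to a keywordized Clojure map
  (cur/format-as db-cur :map))
```

Because `add-options` is a multimethod, the same call accepts a map, a list of keywords, a single keyword, or a raw `Bytes` option value.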
@ -1,62 +1,39 @@
|
|||
;; This source code is dual-licensed under the Apache License, version
;; 2.0, and the Eclipse Public License, version 1.0.
;;
;; The APL v2.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team
;; Copyright (c) 2011-2012 Michael S. Klishin
;; Copyright (c) 2012 Toby Hede
;;
;; Licensed under the Apache License, Version 2.0 (the "License");
;; you may not use this file except in compliance with the License.
;; You may obtain a copy of the License at
;;
;; http://www.apache.org/licenses/LICENSE-2.0
;;
;; Unless required by applicable law or agreed to in writing, software
;; distributed under the License is distributed on an "AS IS" BASIS,
;; WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
;; See the License for the specific language governing permissions and
;; limitations under the License.
;; ----------------------------------------------------------------------------------
;;
;; The EPL v1.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team.
;; Copyright (c) 2012 Toby Hede
;; All rights reserved.
;;
;; This program and the accompanying materials are made available under the terms of
;; the Eclipse Public License Version 1.0,
;; which accompanies this distribution and is available at
;; http://www.eclipse.org/legal/epl-v10.html.
;; ----------------------------------------------------------------------------------
;; The use and distribution terms for this software are covered by the
;; Eclipse Public License 1.0 (http://opensource.org/licenses/eclipse-1.0.php)
;; which can be found in the file epl-v10.html at the root of this distribution.
;; By using this software in any fashion, you are agreeing to be bound by
;; the terms of this license.
;; You must not remove this notice, or any other, from this software.
(ns monger.db
  "Functions that provide operations on databases"
  (:refer-clojure :exclude [find remove count drop distinct empty?])
  (:import [com.mongodb Mongo DB DBCollection])
  (:require monger.core
            [monger.conversion :refer :all]))
  (:require monger.core)
  (:use monger.conversion))
;;
;; API
;;

(defn add-user
  "Adds a new user for this db"
  [^DB db ^String username ^chars password]
  (.addUser db username password))
  ([^String username, ^chars password]
   (.addUser ^DB monger.core/*mongodb-database* username password))
  ([^DB database ^String username ^chars password]
   (.addUser ^DB database username password)))

(defn drop-db
  "Drops the currently set database (via core/set-db) or the specified database."
  [^DB db]
  (.dropDatabase db))
  ([]
   (.dropDatabase ^DB monger.core/*mongodb-database*))
  ([^DB database]
   (.dropDatabase ^DB database)))

(defn get-collection-names
  "Returns a set containing the names of all collections in this database."
  ([^DB db]
   (set (.getCollectionNames db))))
  ([]
   (set (.getCollectionNames ^DB monger.core/*mongodb-database*)))
  ([^DB database]
   (set (.getCollectionNames ^DB database))))
@@ -1,50 +1,26 @@
(ns monger.gridfs
  "Provides functions and macros for working with GridFS: storing files in GridFS, streaming files from GridFS,
   finding stored files.
(ns
  ^{:doc "Provides functions and macros for working with GridFS: storing files in GridFS, streaming files from GridFS,
          finding stored files.

   Related documentation guide: http://clojuremongodb.info/articles/gridfs.html"
          Related documentation guide: http://clojuremongodb.info/articles/gridfs.html"}
  monger.gridfs
  (:refer-clojure :exclude [remove find])
  (:require monger.core
            [clojure.java.io :as io]
            [monger.conversion :refer :all]
            [clojurewerkz.support.fn :refer [fpartial]])
            [clojure.java.io :as io])
  (:use monger.conversion
        [clojurewerkz.support.fn :only [fpartial]])
  (:import [com.mongodb DB DBObject]
           org.bson.types.ObjectId
           [com.mongodb.gridfs GridFS GridFSInputFile]
           [java.io InputStream ByteArrayInputStream File]))
           [java.io InputStream File]))

;;
;; Implementation
@@ -66,16 +42,24 @@

(defn remove
  [^GridFS fs query]
  (.remove fs ^DBObject (to-db-object query)))
  ([]
   (remove {}))
  ([query]
   (.remove ^GridFS monger.core/*mongodb-gridfs* ^DBObject (to-db-object query)))
  ([^GridFS fs query]
   (.remove fs ^DBObject (to-db-object query))))

(defn remove-all
  [^GridFS fs]
  (remove fs {}))
  ([]
   (remove {}))
  ([^GridFS fs]
   (remove fs {})))

(defn all-files
  ([^GridFS fs]
   (.getFileList fs (to-db-object {})))
  ([]
   (.getFileList ^GridFS monger.core/*mongodb-gridfs*))
  ([query]
   (.getFileList ^GridFS monger.core/*mongodb-gridfs* query))
  ([^GridFS fs query]
   (.getFileList fs query)))
@@ -83,8 +67,10 @@
  (fpartial from-db-object true))

(defn files-as-maps
  ([^GridFS fs]
   (files-as-maps fs {}))
  ([]
   (map converter (all-files)))
  ([query]
   (map converter (all-files (to-db-object query))))
  ([^GridFS fs query]
   (map converter (all-files fs (to-db-object query)))))
@@ -93,51 +79,27 @@
;; Plumbing (low-level API)
;;

(defprotocol InputStreamFactory
  (^InputStream to-input-stream [input] "Makes InputStream out of the given input"))

(extend byte-array-type
  InputStreamFactory
  {:to-input-stream (fn [^bytes input]
                      (ByteArrayInputStream. input))})

(extend-protocol InputStreamFactory
  String
  (to-input-stream [^String input]
    (io/make-input-stream input {:encoding "UTF-8"}))

  File
  (to-input-stream [^File input]
    (io/make-input-stream input {:encoding "UTF-8"}))

  InputStream
  (to-input-stream [^InputStream input]
    input))

(defprotocol GridFSInputFileFactory
  (^GridFSInputFile create-gridfs-file [input ^GridFS fs] "Creates a file entry"))
  (^com.mongodb.gridfs.GridFSInputFile make-input-file [input] "Makes GridFSInputFile out of the given input"))

(extend byte-array-type
  GridFSInputFileFactory
  {:create-gridfs-file (fn [^bytes input ^GridFS fs]
                         (.createFile fs input))})
  {:make-input-file (fn [^bytes input]
                      (.createFile ^GridFS monger.core/*mongodb-gridfs* input))})

(extend-protocol GridFSInputFileFactory
  String
  (create-gridfs-file [^String input ^GridFS fs]
    (.createFile fs (io/file input)))
  (make-input-file [^String input]
    (.createFile ^GridFS monger.core/*mongodb-gridfs* ^InputStream (io/make-input-stream input {:encoding "UTF-8"})))

  File
  (create-gridfs-file [^File input ^GridFS fs]
    (.createFile fs input))
  (make-input-file [^File input]
    (.createFile ^GridFS monger.core/*mongodb-gridfs* ^InputStream (io/make-input-stream input {:encoding "UTF-8"})))

  InputStream
  (create-gridfs-file [^InputStream input ^GridFS fs]
    (.createFile fs input)))
  (make-input-file [^InputStream input]
    (.createFile ^GridFS monger.core/*mongodb-gridfs* ^InputStream input)))

(defn ^GridFSInputFile make-input-file
  [^GridFS fs input]
  (create-gridfs-file input fs))

(defmacro store
  [^GridFSInputFile input & body]
@@ -145,8 +107,9 @@
  (.save f# GridFS/DEFAULT_CHUNKSIZE)
  (from-db-object f# true)))

;;
;; Higher-level API
;; "New" DSL, a higher-level API
;;

(defn save
@@ -179,34 +142,48 @@
;; Finders
;;

(defn find
  [^GridFS fs query]
  (.find fs (to-db-object query)))
(defprotocol Finders
  (find [input] "Finds multiple files using given input (an ObjectId, filename or query)")
  (find-one [input] "Finds one file using given input (an ObjectId, filename or query)")
  (find-maps [input] "Finds multiple files using given input (an ObjectId, filename or query), returning a Clojure map")
  (find-one-as-map [input] "Finds one file using given input (an ObjectId, filename or query), returning a Clojure map"))

(defn find-by-filename
  [^GridFS fs ^String filename]
  (.find fs (to-db-object {"filename" filename})))
(extend-protocol Finders
  String
  (find [^String input]
    (.find ^GridFS monger.core/*mongodb-gridfs* input))
  (find-one [^String input]
    (.findOne ^GridFS monger.core/*mongodb-gridfs* input))
  (find-maps [^String input]
    (map converter (find input)))
  (find-one-as-map [^String input]
    (converter (find-one input)))

(defn find-by-md5
  [^GridFS fs ^String md5]
  (.find fs (to-db-object {"md5" md5})))
  org.bson.types.ObjectId
  (find-one [^org.bson.types.ObjectId input]
    (.findOne ^GridFS monger.core/*mongodb-gridfs* input))
  (find-one-as-map [^org.bson.types.ObjectId input]
    (converter (find-one input)))

(defn find-one
  [^GridFS fs query]
  (.findOne fs (to-db-object query)))

(defn find-maps
  [^GridFS fs query]
  (map converter (find fs query)))
  DBObject
  (find [^DBObject input]
    (.find ^GridFS monger.core/*mongodb-gridfs* input))
  (find-one [^DBObject input]
    (.findOne ^GridFS monger.core/*mongodb-gridfs* input))
  (find-maps [^DBObject input]
    (map converter (find input)))
  (find-one-as-map [^DBObject input]
    (converter (find-one input)))

(defn find-one-as-map
  [^GridFS fs query]
  (converter (find-one fs query)))
  ;; using java.util.Map here results in (occasional) recursion
  clojure.lang.IPersistentMap
  (find [^java.util.Map input]
    (find (to-db-object input)))
  (find-one [^java.util.Map input]
    (find-one (to-db-object input)))
  (find-maps [^java.util.Map input]
    (find-maps (to-db-object input)))
  (find-one-as-map [^java.util.Map input]
    (find-one-as-map (to-db-object input))))

(defn find-by-id
  [^GridFS fs ^ObjectId id]
  (.findOne fs id))

(defn find-map-by-id
  [^GridFS fs ^ObjectId id]
  (converter (find-by-id fs id)))
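A hedged usage sketch (not part of the diff; the filenames and id are invented) contrasting the two GridFS finder styles shown above:

```clojure
;; Explicit-GridFS style (one side of the diff): the GridFS instance is
;; passed in, queries are plain maps.
;;   (find-one fs {:filename "resume.pdf"})
;;   (find-map-by-id fs (org.bson.types.ObjectId. "4fea999c0364d8e880c05157"))
;;
;; Protocol-dispatch style (the other side): the finder is chosen by the
;; input's type and the dynamic *mongodb-gridfs* var supplies the GridFS.
;;   (find-one "resume.pdf")        ;; String dispatches to findOne-by-filename
;;   (find-one {:md5 "0f3c..."})    ;; an IPersistentMap is converted to a DBObject query
```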
73	src/clojure/monger/internal/fn.clj	Normal file
@@ -0,0 +1,73 @@
;; Copyright (c) 2011-2012 Michael S. Klishin
;;
;; The use and distribution terms for this software are covered by the
;; Eclipse Public License 1.0 (http://opensource.org/licenses/eclipse-1.0.php)
;; which can be found in the file epl-v10.html at the root of this distribution.
;; By using this software in any fashion, you are agreeing to be bound by
;; the terms of this license.
;; You must not remove this notice, or any other, from this software.

(ns monger.internal.fn)

;;
;; Implementation
;;

(defn- apply-to-values
  "Applies function f to all values in map m"
  [m f]
  (into {} (for [[k v] m]
             [k (f v)])))

;;
;; API
;;

(defn fpartial
  "Like clojure.core/partial but prepopulates last N arguments (first is passed in later)"
  [f & args]
  (fn [arg & more] (apply f arg (concat args more))))
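A stand-alone sketch (not part of the diff) showing how `fpartial` differs from `clojure.core/partial`: it fixes the trailing arguments and leaves the first argument to be supplied later.

```clojure
;; Same definition as monger.internal.fn/fpartial above.
(defn fpartial
  [f & args]
  (fn [arg & more] (apply f arg (concat args more))))

((fpartial conj :a) [])   ;; => [:a]   (conj [] :a)
((fpartial / 2) 10)       ;; => 5     (/ 10 2)
((partial / 2) 10)        ;; => 1/5   (/ 2 10), for contrast
```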
(defprotocol IFNExpansion
  (expand-all [x] "Replaces functions with their invocation results, recursively expands maps, evaluates all other values to themselves")
  (expand-all-with [x f] "Replaces functions with their invocation results that function f is applied to, recursively expands maps, evaluates all other values to themselves"))

(extend-protocol IFNExpansion
  java.lang.Integer
  (expand-all [i] i)
  (expand-all-with [i f] i)

  java.lang.Long
  (expand-all [l] l)
  (expand-all-with [l f] l)

  java.lang.String
  (expand-all [s] s)
  (expand-all-with [s f] s)

  java.lang.Float
  (expand-all [fl] fl)
  (expand-all-with [fl f] fl)

  java.lang.Double
  (expand-all [d] d)
  (expand-all-with [d f] d)

  ;; maps are also functions, so be careful here. MK.
  clojure.lang.IPersistentMap
  (expand-all [m] (apply-to-values m expand-all))
  (expand-all-with [m f] (apply-to-values m (fpartial expand-all-with f)))

  clojure.lang.PersistentVector
  (expand-all [v] (map expand-all v))
  (expand-all-with [v f] (map (fpartial expand-all-with f) v))

  ;; this distinguishes functions from maps, sets and so on, which are also
  ;; clojure.lang.AFn subclasses. MK.
  clojure.lang.AFunction
  (expand-all [f] (f))
  (expand-all-with [f expander] (expander f))

  Object
  (expand-all [x] x)
  (expand-all-with [x f] x))
@@ -1,35 +1,11 @@
(ns monger.internal.pagination)
@@ -1,48 +1,23 @@
(ns monger.joda-time
  "An optional convenience namespace for applications that heavily use dates and would prefer to use JodaTime types
   transparently when storing and loading them from MongoDB and serializing to JSON and/or with Clojure reader.
(ns ^{:doc "An optional convenience namespace for applications that heavily use dates and would prefer to use JodaTime types
            transparently when storing and loading them from MongoDB and serializing to JSON and/or with Clojure reader.

   Enables automatic conversion of JodaTime date/time/instant instances to JDK dates (java.util.Date) when documents
   are serialized and the other way around when documents are loaded. Extends clojure.data.json/Write-JSON protocol for
   JodaTime types.
            Enables automatic conversion of JodaTime date/time/instant instances to JDK dates (java.util.Date) when documents
            are serialized and the other way around when documents are loaded. Extends clojure.data.json/Write-JSON protocol for
            JodaTime types.

   To use it, make sure you add dependencies on clj-time (or JodaTime) and clojure.data.json."
            To use it, make sure you add dependencies on clj-time (or JodaTime) and clojure.data.json."} monger.joda-time
  (:import [org.joda.time DateTime DateTimeZone ReadableInstant]
           [org.joda.time.format ISODateTimeFormat])
  (:require [monger.conversion :refer :all]))
  (:use [monger.conversion]))

;;
;; API
@@ -51,9 +26,6 @@
(extend-protocol ConvertToDBObject
  org.joda.time.base.AbstractInstant
  (to-db-object [^AbstractInstant input]
    (to-db-object (.toDate input)))
  org.joda.time.base.AbstractPartial
  (to-db-object [^AbstractPartial input]
    (to-db-object (.toDate input))))

(extend-protocol ConvertFromDBObject
@@ -67,15 +39,23 @@
;; Reader extensions
;;

(defmethod print-dup java.util.Date
  [^java.util.Date d ^java.io.Writer out]
  (.write out
          (str "#="
               `(java.util.Date. ~(.getYear d)
                                 ~(.getMonth d)
                                 ~(.getDate d)
                                 ~(.getHours d)
                                 ~(.getMinutes d)
                                 ~(.getSeconds d)))))

(defmethod print-dup org.joda.time.base.AbstractInstant
  [^org.joda.time.base.AbstractInstant d out]
  (print-dup (.toDate d) out))

(defmethod print-dup org.joda.time.base.AbstractPartial
  [^org.joda.time.base.AbstractPartial d out]
  (print-dup (.toDate d) out))

;;
;; JSON serialization
;;
@@ -1,38 +1,13 @@
(ns monger.js
  "Kept for backwards compatibility. Please use clojurewerkz.support.js from now on."
(ns ^{:doc "Kept for backwards compatibility. Please use clojurewerkz.support.js from now on."} monger.js
  (:require [clojurewerkz.support.js :as js]))
@@ -1,41 +1,16 @@
(ns monger.json
  "Provides clojure.data.json/Write-JSON protocol extension for MongoDB-specific types, such as
   org.bson.types.ObjectId"
  (:import org.bson.types.ObjectId
           org.bson.types.BSONTimestamp))
(ns ^{:doc "Provides clojure.data.json/Write-JSON protocol extension for MongoDB-specific types, such as
            org.bson.types.ObjectId"}
  monger.json
  (:import org.bson.types.ObjectId))

;;
;; Implementation
@@ -70,20 +45,8 @@
(try
  (extend-protocol clojure.data.json/JSONWriter
    ObjectId
    (-write
      ([^ObjectId object out]
       (clojure.data.json/write (.toString object) out))
      ([^ObjectId object out options]
       (clojure.data.json/write (.toString object) out options))))

  (extend-protocol clojure.data.json/JSONWriter
    BSONTimestamp
    (-write
      ([^BSONTimestamp object out]
       (clojure.data.json/write {:time (.getTime object) :inc (.getInc object)} out))
      ([^BSONTimestamp object out options]
       (clojure.data.json/write {:time (.getTime object) :inc (.getInc object)} out options))))

    (-write [^ObjectId object out]
      (clojure.data.json/write (.toString object) out)))
  (catch Throwable _
    false))
  (comment "Nothing to do, clojure.data.json is not available"))
@@ -109,8 +72,5 @@
(cheshire.generate/add-encoder ObjectId
                               (fn [^ObjectId oid ^com.fasterxml.jackson.core.json.WriterBasedJsonGenerator generator]
                                 (.writeString generator (.toString oid))))
(cheshire.generate/add-encoder BSONTimestamp
                               (fn [^BSONTimestamp ts ^com.fasterxml.jackson.core.json.WriterBasedJsonGenerator generator]
                                 (cheshire.generate/encode-map {:time (.getTime ts) :inc (.getInc ts)} generator)))
(catch Throwable t
  false))
360	src/clojure/monger/multi/collection.clj	Normal file
@@ -0,0 +1,360 @@
(ns monger.multi.collection
  "Includes versions of key monger.collection functions that always take a database
   as explicit argument instead of relying on monger.core/*mongodb-database*.

   Use these functions when you need to work with multiple databases or manage database
   and connection lifecycle explicitly."
  (:refer-clojure :exclude [find remove count empty? distinct drop])
  (:import [com.mongodb Mongo DB DBCollection WriteResult DBObject WriteConcern DBCursor MapReduceCommand MapReduceCommand$OutputType]
           [java.util List Map]
           [clojure.lang IPersistentMap ISeq]
           org.bson.types.ObjectId)
  (:require monger.core
            monger.result)
  (:use monger.conversion
        monger.constraints))
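A hedged usage sketch of the explicit-database style this namespace describes (the connection setup and database names are invented for illustration, assuming the `monger.core/connect!` / `monger.core/get-db` entry points documented in the Monger guides):

```clojure
;; Sketch only: every monger.multi.collection call below takes the DB
;; as its first argument, so two databases can be used side by side.
;;   (require '[monger.core :as mg]
;;            '[monger.multi.collection :as mc])
;;   (mg/connect!)
;;   (let [archive (mg/get-db "archive")
;;         live    (mg/get-db "live")]
;;     (mc/insert archive "events" {:type "signup"})
;;     (mc/find-maps live "events" {:type "signup"}))
```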
||||
|
||||
;;
|
||||
;; API
|
||||
;;
|
||||
|
||||
(defn ^WriteResult insert
|
||||
"Like monger.collection/insert but always takes a database as explicit argument"
|
||||
([^DB db ^String collection document]
|
||||
(.insert (.getCollection db (name collection))
|
||||
(to-db-object document)
|
||||
monger.core/*mongodb-write-concern*))
|
||||
([^DB db ^String collection document ^WriteConcern concern]
|
||||
(.insert (.getCollection db (name collection))
|
||||
(to-db-object document)
|
||||
concern)))
|
||||
|
||||
|
||||
(defn ^clojure.lang.IPersistentMap insert-and-return
|
||||
"Like monger.collection/insert-and-return but always takes a database as explicit argument"
|
||||
([^DB db ^String collection document]
|
||||
(let [doc (merge {:_id (ObjectId.)} document)]
|
||||
(insert db collection doc monger.core/*mongodb-write-concern*)
|
||||
doc))
|
||||
([^DB db ^String collection document ^WriteConcern concern]
|
||||
;; MongoDB Java driver will generate the _id and set it but it tries to mutate the inserted DBObject
|
||||
;; and it does not work very well in our case, because that DBObject is short lived and produced
|
||||
;; from the Clojure map we are passing in. Plus, this approach is very awkward with immutable data
|
||||
;; structures being the default. MK.
|
||||
(let [doc (merge {:_id (ObjectId.)} document)]
|
||||
(insert db collection doc concern)
|
||||
doc)))
|
||||
|
||||
|
||||
(defn ^WriteResult insert-batch
|
||||
"Like monger.collection/insert-batch but always takes a database as explicit argument"
|
||||
([^DB db ^String collection ^List documents]
|
||||
(.insert (.getCollection db (name collection))
|
||||
^List (to-db-object documents)
|
||||
monger.core/*mongodb-write-concern*))
|
||||
([^DB db ^String collection ^List documents ^WriteConcern concern]
|
||||
(.insert (.getCollection db (name collection))
|
||||
^List (to-db-object documents)
|
||||
concern)))
|
||||
|
||||
|
||||
;;
;; monger.multi.collection/find
;;

(defn ^DBCursor find
  "Like monger.collection/find but always takes a database as explicit argument"
  ([^DB db ^String collection]
   (.find (.getCollection db (name collection))))
  ([^DB db ^String collection ^Map ref]
   (.find (.getCollection db (name collection))
          (to-db-object ref)))
  ([^DB db ^String collection ^Map ref fields]
   (.find (.getCollection db (name collection))
          (to-db-object ref)
          (as-field-selector fields))))

(defn find-maps
  "Like monger.collection/find-maps but always takes a database as explicit argument"
  ([^DB db ^String collection]
   (with-open [dbc (find db collection)]
     ;; realize the results eagerly: the cursor is closed when with-open exits,
     ;; so a lazy seq must not escape this scope
     (doall (map (fn [x] (from-db-object x true)) dbc))))
  ([^DB db ^String collection ^Map ref]
   (with-open [dbc (find db collection ref)]
     (doall (map (fn [x] (from-db-object x true)) dbc))))
  ([^DB db ^String collection ^Map ref fields]
   (with-open [dbc (find db collection ref fields)]
     (doall (map (fn [x] (from-db-object x true)) dbc)))))

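;; EXAMPLE (illustrative sketch, not part of the library; `db`, the collection
;; name and the query values are assumptions):
;;
;;   (find-maps db "documents" {:age {$gt 21}})
;;   ;; returns a sequence of Clojure maps with keywordized keys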
(defn find-seq
  "Like monger.collection/find-seq but always takes a database as explicit argument"
  ([^DB db ^String collection]
   (with-open [dbc (find db collection)]
     ;; realize eagerly: the cursor is closed when with-open exits
     (doall (seq dbc))))
  ([^DB db ^String collection ^Map ref]
   (with-open [dbc (find db collection ref)]
     (doall (seq dbc))))
  ([^DB db ^String collection ^Map ref fields]
   (with-open [dbc (find db collection ref fields)]
     (doall (seq dbc)))))

;;
;; monger.multi.collection/find-one
;;

(defn ^DBObject find-one
  "Like monger.collection/find-one but always takes a database as explicit argument"
  ([^DB db ^String collection ^Map ref]
   (.findOne (.getCollection db (name collection))
             (to-db-object ref)))
  ([^DB db ^String collection ^Map ref fields]
   (.findOne (.getCollection db (name collection))
             (to-db-object ref)
             ^DBObject (as-field-selector fields))))

(defn ^IPersistentMap find-one-as-map
  "Like monger.collection/find-one-as-map but always takes a database as explicit argument"
  ([^DB db ^String collection ^Map ref]
   (from-db-object ^DBObject (find-one db collection ref) true))
  ([^DB db ^String collection ^Map ref fields]
   (from-db-object ^DBObject (find-one db collection ref fields) true))
  ([^DB db ^String collection ^Map ref fields keywordize]
   (from-db-object ^DBObject (find-one db collection ref fields) keywordize)))

;;
;; monger.multi.collection/find-and-modify
;;

(defn ^IPersistentMap find-and-modify
  "Like monger.collection/find-and-modify but always takes a database as explicit argument"
  ([^DB db ^String collection ^Map conditions ^Map document & {:keys [fields sort remove return-new upsert keywordize]
                                                               :or {fields nil sort nil remove false
                                                                    return-new false upsert false keywordize true}}]
   (let [coll         (.getCollection db (name collection))
         maybe-fields (when fields (as-field-selector fields))
         maybe-sort   (when sort (to-db-object sort))]
     (from-db-object
      ^DBObject (.findAndModify ^DBCollection coll (to-db-object conditions) maybe-fields maybe-sort remove
                                (to-db-object document) return-new upsert)
      keywordize))))

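;; EXAMPLE (illustrative sketch, not part of the library; `db` and the
;; collection/field names are assumptions):
;;
;;   (find-and-modify db "scores" {:player "joe"} {$inc {:score 10}}
;;                    :return-new true :upsert true)
;;   ;; atomically increments and returns the updated document as a map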
;;
;; monger.multi.collection/find-by-id
;;

(defn ^DBObject find-by-id
  "Like monger.collection/find-by-id but always takes a database as explicit argument"
  ([^DB db ^String collection id]
   (check-not-nil! id "id must not be nil")
   (find-one db collection {:_id id}))
  ([^DB db ^String collection id fields]
   (check-not-nil! id "id must not be nil")
   (find-one db collection {:_id id} fields)))

(defn ^IPersistentMap find-map-by-id
  "Like monger.collection/find-map-by-id but always takes a database as explicit argument"
  ([^DB db ^String collection id]
   (check-not-nil! id "id must not be nil")
   (from-db-object ^DBObject (find-one-as-map db collection {:_id id}) true))
  ([^DB db ^String collection id fields]
   (check-not-nil! id "id must not be nil")
   (from-db-object ^DBObject (find-one-as-map db collection {:_id id} fields) true))
  ([^DB db ^String collection id fields keywordize]
   (check-not-nil! id "id must not be nil")
   (from-db-object ^DBObject (find-one-as-map db collection {:_id id} fields) keywordize)))

;;
;; monger.multi.collection/count
;;

(defn count
  "Like monger.collection/count but always takes a database as explicit argument"
  (^long [^DB db ^String collection]
   (.count (.getCollection db (name collection)) (to-db-object {})))
  (^long [^DB db ^String collection ^Map conditions]
   (.count (.getCollection db (name collection)) (to-db-object conditions))))

(defn any?
  "Like monger.collection/any? but always takes a database as explicit argument"
  ([^DB db ^String collection]
   (> (count db collection) 0))
  ([^DB db ^String collection ^Map conditions]
   (> (count db collection conditions) 0)))

(defn empty?
  "Like monger.collection/empty? but always takes a database as explicit argument"
  ([^DB db ^String collection]
   (= (count db collection {}) 0)))

(defn ^WriteResult update
  "Like monger.collection/update but always takes a database as explicit argument"
  ([^DB db ^String collection ^Map conditions ^Map document & {:keys [upsert multi write-concern]
                                                               :or {upsert false
                                                                    multi false
                                                                    write-concern monger.core/*mongodb-write-concern*}}]
   (.update (.getCollection db (name collection))
            (to-db-object conditions)
            (to-db-object document)
            upsert
            multi
            write-concern)))

(defn ^WriteResult upsert
  "Like monger.collection/upsert but always takes a database as explicit argument"
  [^DB db ^String collection ^Map conditions ^Map document & {:keys [multi write-concern]
                                                              :or {multi false
                                                                   write-concern monger.core/*mongodb-write-concern*}}]
  (update db collection conditions document :multi multi :write-concern write-concern :upsert true))

(defn ^WriteResult update-by-id
  "Like monger.collection/update-by-id but always takes a database as explicit argument"
  [^DB db ^String collection id ^Map document & {:keys [upsert write-concern]
                                                 :or {upsert false
                                                      write-concern monger.core/*mongodb-write-concern*}}]
  (check-not-nil! id "id must not be nil")
  (.update (.getCollection db (name collection))
           (to-db-object {:_id id})
           (to-db-object document)
           upsert
           false
           write-concern))

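;; EXAMPLES (illustrative sketches, not part of the library; `db`, `oid` and the
;; collection/field names are assumptions):
;;
;;   (update db "scores" {:_id oid} {$set {:score 100}})
;;   (upsert db "scores" {:player "joe"} {$inc {:score 10}})
;;   (update-by-id db "scores" oid {$set {:score 0}})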
(defn ^WriteResult save
  "Like monger.collection/save but always takes a database as explicit argument"
  ([^DB db ^String collection ^Map document]
   (.save (.getCollection db (name collection))
          (to-db-object document)
          monger.core/*mongodb-write-concern*))
  ([^DB db ^String collection ^Map document ^WriteConcern write-concern]
   (.save (.getCollection db (name collection))
          (to-db-object document)
          write-concern)))

(defn ^clojure.lang.IPersistentMap save-and-return
  "Like monger.collection/save-and-return but always takes a database as explicit argument"
  ([^DB db ^String collection ^Map document]
   (save-and-return ^DB db collection document ^WriteConcern monger.core/*mongodb-write-concern*))
  ([^DB db ^String collection ^Map document ^WriteConcern write-concern]
   ;; see the comment in insert-and-return. Here we additionally need to make sure to not scrap the :_id key if
   ;; it is already present. MK.
   (let [doc (merge {:_id (ObjectId.)} document)]
     (save db collection doc write-concern)
     doc)))

(defn ^WriteResult remove
  "Like monger.collection/remove but always takes a database as explicit argument"
  ([^DB db ^String collection]
   (.remove (.getCollection db (name collection)) (to-db-object {})))
  ([^DB db ^String collection ^Map conditions]
   (.remove (.getCollection db (name collection)) (to-db-object conditions))))

(defn ^WriteResult remove-by-id
  "Like monger.collection/remove-by-id but always takes a database as explicit argument"
  ([^DB db ^String collection id]
   (check-not-nil! id "id must not be nil")
   (let [coll (.getCollection db (name collection))]
     (.remove coll (to-db-object {:_id id})))))

;;
;; monger.multi.collection/create-index
;;

(defn create-index
  "Like monger.collection/create-index but always takes a database as explicit argument"
  ([^DB db ^String collection ^Map keys]
   (.createIndex (.getCollection db (name collection)) (as-field-selector keys)))
  ([^DB db ^String collection ^Map keys ^Map options]
   (.createIndex (.getCollection db (name collection))
                 (as-field-selector keys)
                 (to-db-object options))))

;;
;; monger.multi.collection/ensure-index
;;

(defn ensure-index
  "Like monger.collection/ensure-index but always takes a database as explicit argument"
  ([^DB db ^String collection ^Map keys]
   (.ensureIndex (.getCollection db (name collection)) (as-field-selector keys)))
  ([^DB db ^String collection ^Map keys ^Map options]
   (.ensureIndex (.getCollection db (name collection))
                 (as-field-selector keys)
                 (to-db-object options)))
  ([^DB db ^String collection ^Map keys ^String idx-name ^Boolean unique?]
   ;; the index name parameter is called idx-name so that it does not shadow
   ;; clojure.core/name, which is used to coerce the collection argument
   (.ensureIndex (.getCollection db (name collection))
                 (as-field-selector keys)
                 idx-name
                 unique?)))

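;; EXAMPLE (illustrative sketch, not part of the library; `db` and the index keys
;; are assumptions; array-map keeps the key order MongoDB expects for compound indexes):
;;
;;   (ensure-index db "documents" (array-map :login 1) {:unique true})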
;;
;; monger.multi.collection/indexes-on
;;

(defn indexes-on
  "Like monger.collection/indexes-on but always takes a database as explicit argument"
  [^DB db ^String collection]
  (from-db-object (.getIndexInfo (.getCollection db (name collection))) true))


;;
;; monger.multi.collection/drop-index
;;

(defn drop-index
  "Like monger.collection/drop-index but always takes a database as explicit argument"
  ([^DB db ^String collection ^String idx-name]
   (.dropIndex (.getCollection db (name collection)) idx-name)))

(defn drop-indexes
  "Like monger.collection/drop-indexes but always takes a database as explicit argument"
  ([^DB db ^String collection]
   (.dropIndexes (.getCollection db (name collection)))))


;;
;; monger.multi.collection/exists?, /create, /drop, /rename
;;

(defn exists?
  "Like monger.collection/exists? but always takes a database as explicit argument"
  ([^DB db ^String collection]
   (.collectionExists db collection)))

(defn create
  "Like monger.collection/create but always takes a database as explicit argument"
  ([^DB db ^String collection ^Map options]
   (.createCollection db collection (to-db-object options))))

(defn drop
  "Like monger.collection/drop but always takes a database as explicit argument"
  ([^DB db ^String collection]
   (.drop (.getCollection db (name collection)))))

(defn rename
  "Like monger.collection/rename but always takes a database as explicit argument"
  ([^DB db ^String from ^String to]
   (.rename (.getCollection db from) to))
  ([^DB db ^String from ^String to ^Boolean drop-target]
   (.rename (.getCollection db from) to drop-target)))

;;
;; Map/Reduce
;;

(defn map-reduce
  "Like monger.collection/map-reduce but always takes a database as explicit argument"
  ([^DB db ^String collection ^String js-mapper ^String js-reducer ^String output ^Map query]
   (let [coll (.getCollection db (name collection))]
     (.mapReduce coll js-mapper js-reducer output (to-db-object query))))
  ([^DB db ^String collection ^String js-mapper ^String js-reducer ^String output ^MapReduceCommand$OutputType output-type ^Map query]
   (let [coll (.getCollection db (name collection))]
     (.mapReduce coll js-mapper js-reducer output output-type (to-db-object query)))))


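;; EXAMPLE (illustrative sketch, not part of the library; `db`, the collection
;; name and the JavaScript functions are assumptions):
;;
;;   (map-reduce db "events"
;;               "function() { emit(this.type, 1); }"
;;               "function(key, vals) { return Array.sum(vals); }"
;;               "event_counts"
;;               {})
;;   ;; writes per-type counts into the "event_counts" output collection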
;;
;; monger.multi.collection/distinct
;;

(defn distinct
  "Like monger.collection/distinct but always takes a database as explicit argument"
  ([^DB db ^String collection ^String key]
   (.distinct (.getCollection db (name collection)) ^String (to-db-object key)))
  ([^DB db ^String collection ^String key ^Map query]
   (.distinct (.getCollection db (name collection)) ^String (to-db-object key) (to-db-object query))))

@@ -1,42 +1,18 @@
;; This source code is dual-licensed under the Apache License, version
;; 2.0, and the Eclipse Public License, version 1.0.
;; Copyright (c) 2011-2012 Michael S. Klishin
;;
;; The APL v2.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team
;;
;; Licensed under the Apache License, Version 2.0 (the "License");
;; you may not use this file except in compliance with the License.
;; You may obtain a copy of the License at
;;
;; http://www.apache.org/licenses/LICENSE-2.0
;;
;; Unless required by applicable law or agreed to in writing, software
;; distributed under the License is distributed on an "AS IS" BASIS,
;; WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
;; See the License for the specific language governing permissions and
;; limitations under the License.
;; ----------------------------------------------------------------------------------
;;
;; The EPL v1.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team.
;; All rights reserved.
;;
;; This program and the accompanying materials are made available under the terms of
;; the Eclipse Public License Version 1.0,
;; which accompanies this distribution and is available at
;; http://www.eclipse.org/legal/epl-v10.html.
;; ----------------------------------------------------------------------------------
;; The use and distribution terms for this software are covered by the
;; Eclipse Public License 1.0 (http://opensource.org/licenses/eclipse-1.0.php)
;; which can be found in the file epl-v10.html at the root of this distribution.
;; By using this software in any fashion, you are agreeing to be bound by
;; the terms of this license.
;; You must not remove this notice, or any other, from this software.

(ns monger.operators
  "Provides vars that represent various MongoDB operators, for example, $gt or $in or $regex.
   They can be passed in queries as strings but using vars from this namespace makes the code
   a bit cleaner and closer to what you would see in a MongoDB shell query.

   Related documentation guide: http://clojuremongodb.info/articles/querying.html")

(defmacro ^{:private true} defoperator
  [operator]

@@ -46,8 +22,6 @@
;; QUERY OPERATORS
;;

(declare $gt $gte $lt $lte $all $in $nin $eq $ne $elemMatch $regex $options)

;; $gt is "greater than" comparator
;; $gte is "greater than or equals" comparator
;; $lt is "less than" comparator

@@ -82,16 +56,10 @@
;; (mgcol/find-maps "languages" { :tags { $nin [ "functional" ] } } )
(defoperator $nin)

;; $eq is "equals" comparator
;;
;; EXAMPLES:
;; (monger.collection/find "libraries" { :language { $eq "Clojure" }})
(defoperator $eq)

;; $ne is "non-equals" comparator
;;
;; EXAMPLES:
;; (monger.collection/find "libraries" { :language { $ne "Clojure" }})
;; (monger.collection/find "libraries" {$ne { :language "Clojure" }})
(defoperator $ne)

;; $elemMatch checks if an element in an array matches the specified expression

@@ -104,37 +72,6 @@
(defoperator $regex)
(defoperator $options)

;; comment on a query predicate

(declare $comment $explain $hint $maxTimeMS $orderBy $query $returnKey $showDiskLoc $natural)

(defoperator $comment)
(defoperator $explain)
(defoperator $hint)
(defoperator $maxTimeMS)
(defoperator $orderBy)
(defoperator $query)
(defoperator $returnKey)
(defoperator $showDiskLoc)
(defoperator $natural)


;;
;; EVALUATION (QUERY)
;;

(declare $expr $jsonSchema $where $and $or $nor)

(defoperator $expr)
(defoperator $jsonSchema)

;; Matches documents that satisfy a JavaScript expression.
;;
;; EXAMPLES:
;;
;; (monger.collection/find "people" { $where "this.placeOfBirth === this.address.city" })
(defoperator $where)

;;
;; LOGIC OPERATORS
;;

@@ -167,8 +104,6 @@
;; ATOMIC MODIFIERS
;;

(declare $inc $mul $set $unset $setOnInsert $rename $push $position $each $addToSet $pop $pull $pullAll $bit $bitsAllClear $bitsAllSet $bitsAnyClear $bitsAnySet $exists $mod $size $type $not)

;; $inc increments one or many fields for the given value, otherwise sets the field to value
;;
;; EXAMPLES:

@@ -176,8 +111,6 @@
;; (monger.collection/update "scores" { :_id user-id } { $inc { :score 20 :bonus 10 } })
(defoperator $inc)

(defoperator $mul)

;; $set sets an existing (or non-existing) field (or set of fields) to value
;; $set supports all datatypes.
;;

@@ -212,17 +145,12 @@
;; (mgcol/update "docs" { :_id oid } { $push { :tags "modifiers" } })
(defoperator $push)

;; $position modifies the behavior of $push per https://docs.mongodb.com/manual/reference/operator/update/position/
(defoperator $position)

;; $each is a modifier for the $push and $addToSet operators for appending multiple values to an array field.
;; Without the $each modifier, $push and $addToSet will append an array as a single value.
;; MongoDB 2.4 adds support for the $each modifier to the $push operator.
;; In MongoDB 2.2 the $each modifier can only be used with the $addToSet operator.
;;
;; EXAMPLES:
;; (mgcol/update coll { :_id oid } { $push { :tags { $each ["mongodb" "docs"] } } })
(defoperator $each)

;; $pushAll appends each value in value_array to field, if field is an existing array, otherwise sets field to the
;; array value_array if field is not present. If field is present but is not an array, an error condition is raised.
;;
;; EXAMPLES:
;; (mgcol/update coll { :_id oid } { $pushAll { :tags ["mongodb" "docs"] } })
(defoperator $pushAll)

;; $addToSet adds value to the array only if it's not in the array already, if field is an existing array, otherwise sets field to the
;; array value if field is not present. If field is present but is not an array, an error condition is raised.

@@ -250,15 +178,11 @@
;; an error condition is raised.
;;
;; EXAMPLES:
;; (mgcol/update coll { :_id oid } { $pull { :measurements 1.2 } })
;; (mgcol/update coll { :_id oid } { $pull { :measurements { $gte 1.2 } } })
(defoperator $pullAll)

(defoperator $bit)
(defoperator $bitsAllClear)
(defoperator $bitsAllSet)
(defoperator $bitsAnyClear)
(defoperator $bitsAnySet)

(defoperator $exists)
(defoperator $mod)

@@ -268,132 +192,33 @@


;;
;; Aggregation in 4.2
;;

(declare $addFields $bucket $bucketAuto $collStats $facet $geoNear $graphLookup $indexStats $listSessions $lookup $match $merge $out $planCacheStats $project $redact $replaceRoot $replaceWith $sample $limit $skip $unwind $group $sort $sortByCount $currentOp $listLocalSessions $cmp $min $max $avg $stdDevPop $stdDevSamp $sum $let $first $last $abs $add $ceil $divide $exp $floor $ln $log $log10 $multiply $pow $round $sqrt $subtract $trunc $literal $arrayElemAt $arrayToObject $concatArrays $filter $indexOfArray $isArray $map $objectToArray $range $reduce $reverseArray $zip $mergeObjects $allElementsTrue $anyElementsTrue $setDifference $setEquals $setIntersection $setIsSubset $setUnion $strcasecmp $substr $substrBytes $substrCP $toLower $toString $toUpper $concat $indexOfBytes $indexOfCP $ltrim $regexFind $regexFindAll $regexMatch $rtrim $split $strLenBytes $subLenCP $trim $sin $cos $tan $asin $acos $atan $atan2 $asinh $acosh $atanh $radiansToDegrees $degreesToRadians $convert $toBool $toDecimal $toDouble $toInt $toLong $toObjectId $dayOfMonth $dayOfWeek $dayOfYear $hour $minute $month $second $millisecond $week $year $isoDate $dateFromParts $dateFromString $dateToParts $dateToString $isoDayOfWeek $isoWeek $isoWeekYear $toDate $ifNull $cond $switch)

(defoperator $addFields)
(defoperator $bucket)
(defoperator $bucketAuto)
(defoperator $collStats)
(defoperator $facet)
(defoperator $geoNear)
(defoperator $graphLookup)
(defoperator $indexStats)
(defoperator $listSessions)
(defoperator $lookup)
(defoperator $match)
(defoperator $merge)
(defoperator $out)
(defoperator $planCacheStats)
(defoperator $project)
(defoperator $redact)
(defoperator $replaceRoot)
(defoperator $replaceWith)
(defoperator $sample)
(defoperator $limit)
(defoperator $skip)
(defoperator $unwind)
(defoperator $group)
(defoperator $sort)
(defoperator $sortByCount)

(defoperator $currentOp)
(defoperator $listLocalSessions)

(defoperator $cmp)

(defoperator $min)
(defoperator $max)
(defoperator $avg)
(defoperator $stdDevPop)
(defoperator $stdDevSamp)
(defoperator $sum)
(defoperator $let)

(defoperator $first)
(defoperator $last)

(defoperator $abs)
(defoperator $add)
(defoperator $ceil)
(defoperator $divide)
(defoperator $exp)
(defoperator $floor)
(defoperator $ln)
(defoperator $log)
(defoperator $log10)
(defoperator $multiply)
(defoperator $pow)
(defoperator $round)
(defoperator $sqrt)
(defoperator $subtract)
(defoperator $trunc)
(defoperator $literal)

(defoperator $arrayElemAt)
(defoperator $arrayToObject)
(defoperator $concatArrays)
(defoperator $filter)
(defoperator $indexOfArray)
(defoperator $isArray)
(defoperator $map)
(defoperator $objectToArray)
(defoperator $range)
(defoperator $reduce)
(defoperator $reverseArray)
(defoperator $zip)
(defoperator $mergeObjects)

(defoperator $allElementsTrue)
(defoperator $anyElementsTrue)
(defoperator $setDifference)
(defoperator $setEquals)
(defoperator $setIntersection)
(defoperator $setIsSubset)
(defoperator $setUnion)
;; note: $substract is a misspelling of $subtract (defined above)
(defoperator $substract)

(defoperator $strcasecmp)
(defoperator $substr)
(defoperator $substrBytes)
(defoperator $substrCP)
(defoperator $toLower)
(defoperator $toString)
(defoperator $toUpper)
(defoperator $concat)
(defoperator $indexOfBytes)
(defoperator $indexOfCP)
(defoperator $ltrim)
(defoperator $regexFind)
(defoperator $regexFindAll)
(defoperator $regexMatch)
(defoperator $rtrim)
(defoperator $split)
(defoperator $strLenBytes)
(defoperator $subLenCP)
(defoperator $trim)

(defoperator $sin)
(defoperator $cos)
(defoperator $tan)
(defoperator $asin)
(defoperator $acos)
(defoperator $atan)
(defoperator $atan2)
(defoperator $asinh)
(defoperator $acosh)
(defoperator $atanh)
(defoperator $radiansToDegrees)
(defoperator $degreesToRadians)

(defoperator $convert)
(defoperator $toBool)
(defoperator $toDecimal)
(defoperator $toDouble)
(defoperator $toInt)
(defoperator $toLong)
(defoperator $toObjectId)

(defoperator $dayOfMonth)
(defoperator $dayOfWeek)

@@ -406,54 +231,13 @@
(defoperator $week)
(defoperator $year)
(defoperator $isoDate)
(defoperator $dateFromParts)
(defoperator $dateFromString)
(defoperator $dateToParts)
(defoperator $dateToString)
(defoperator $isoDayOfWeek)
(defoperator $isoWeek)
(defoperator $isoWeekYear)
(defoperator $toDate)


(defoperator $ifNull)
(defoperator $cond)
(defoperator $switch)

;; Geospatial
(declare $geoWithin $geoIntersects $near $nearSphere $geometry $maxDistance $minDistance $center $centerSphere $box $polygon $slice)
(defoperator $geoWithin)
(defoperator $geoIntersects)
(defoperator $near)
(defoperator $nearSphere)
(defoperator $geometry)
(defoperator $maxDistance)
(defoperator $minDistance)
(defoperator $center)
(defoperator $centerSphere)
(defoperator $box)
(defoperator $polygon)

(defoperator $slice)

;; full text search
(declare $text $meta $search $language $natural $currentDate $isolated $count)
(defoperator $text)
(defoperator $meta)
(defoperator $search)
(defoperator $language)
(defoperator $natural)

;; $currentDate operator sets the value of a field to the current date, either as a Date or a timestamp. The default type is Date.
;;
;; EXAMPLES:
;; (mgcol/update coll { :_id oid } { $currentDate { :lastModified true } })
(defoperator $currentDate)

;; Isolates intermediate multi-document updates from other clients.
;;
;; EXAMPLES:
;; (mgcol/update "libraries" { :language "Clojure", $isolated 1 } { $inc { :popularity 1 } } {:multi true})
(defoperator $isolated)

(defoperator $count)

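;; EXAMPLE (illustrative sketch, not part of the library; assumes a text index
;; exists on the queried collection and a hypothetical collection name):
;;
;;   (monger.collection/find-maps "articles" { $text { $search "mongodb clojure" } })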
@@ -1,56 +1,31 @@
;; This source code is dual-licensed under the Apache License, version
;; 2.0, and the Eclipse Public License, version 1.0.
;; Copyright (c) 2011-2012 Michael S. Klishin
;;
;; The APL v2.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team
;;
;; Licensed under the Apache License, Version 2.0 (the "License");
;; you may not use this file except in compliance with the License.
;; You may obtain a copy of the License at
;;
;; http://www.apache.org/licenses/LICENSE-2.0
;;
;; Unless required by applicable law or agreed to in writing, software
;; distributed under the License is distributed on an "AS IS" BASIS,
;; WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
;; See the License for the specific language governing permissions and
;; limitations under the License.
;; ----------------------------------------------------------------------------------
;;
;; The EPL v1.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team.
;; All rights reserved.
;;
;; This program and the accompanying materials are made available under the terms of
;; the Eclipse Public License Version 1.0,
;; which accompanies this distribution and is available at
;; http://www.eclipse.org/legal/epl-v10.html.
;; ----------------------------------------------------------------------------------
;; The use and distribution terms for this software are covered by the
;; Eclipse Public License 1.0 (http://opensource.org/licenses/eclipse-1.0.php)
;; which can be found in the file epl-v10.html at the root of this distribution.
;; By using this software in any fashion, you are agreeing to be bound by
;; the terms of this license.
;; You must not remove this notice, or any other, from this software.

(ns monger.query
  "Provides an expressive Query DSL that is very close to that in the Mongo shell (within reason).
   This is the most flexible and recommended way to query with Monger. Queries can be composed, like in Korma.

   Related documentation guide: http://clojuremongodb.info/articles/querying.html"
  (:refer-clojure :exclude [select find sort])
  (:require [monger.core]
            [monger.internal pagination]
            [monger.cursor :as cursor :refer [add-options]]
            [monger.conversion :refer :all]
            [monger.operators :refer :all])
  (:import [com.mongodb DB DBCollection DBObject DBCursor ReadPreference]
           [java.util.concurrent TimeUnit]
           java.util.List))


;;
;; Implementation
;;

(def ^{:dynamic true} *query-collection*)

;;
;; Cursor/chain methods
;;

@@ -66,6 +41,7 @@
;; :skip - Skips the first N results.
;; :limit - Returns a maximum of N results.
;; :batch-size - limits the number of elements returned in one batch.
;; :hint - forces MongoDB to use a specific index for a query in order to improve performance.
;; :snapshot - uses snapshot mode for the query. Snapshot mode assures no duplicates are returned, or objects missed
;; which were present at both the start and end of the query's execution (if an object is new during the query, or
;; deleted during the query, it may or may not be returned, even with snapshot mode). Note that short query responses

@@ -79,6 +55,7 @@
   :skip 0
   :limit 0
   :batch-size 256
   :hint nil
   :snapshot false
   :keywordize-fields true
   })

@@ -86,35 +63,19 @@
  (merge (empty-query) { :collection coll })))

(defn exec
  [{:keys [^DBCollection collection
           query
           fields
           skip
           limit
           sort
           batch-size
           hint
           snapshot
           read-preference
           keywordize-fields
           max-time
           options]
    :or {limit 0 batch-size 256 skip 0}}]
  (with-open [cursor (doto (.find collection (to-db-object query) (as-field-selector fields))
                       (.limit limit)
                       (.skip skip)
                       (.sort (to-db-object sort))
                       (.batchSize batch-size))]
    (when snapshot
      (.snapshot cursor))
    (when hint
      (.hint cursor (to-db-object hint)))
    (when read-preference
      (.setReadPreference cursor read-preference))
    (when max-time
      (.maxTime cursor max-time TimeUnit/MILLISECONDS))
    (when options
      (add-options cursor options))
    ;; realize the results eagerly: the cursor is closed when with-open exits
    (doall (map (fn [x] (from-db-object x keywordize-fields))
                cursor))))

@ -158,10 +119,6 @@
  [m ^ReadPreference rp]
  (merge m {:read-preference rp}))

(defn max-time
  [m ^long max-time]
  (merge m {:max-time max-time}))

(defn options
  [m opts]
  (merge m {:options opts}))
@ -175,14 +132,12 @@
  (merge m {:limit per-page :skip (monger.internal.pagination/offset-for page per-page)}))

(defmacro with-collection
  [db coll & body]
  `(let [coll#    ~coll
         ^DB db#  ~db
         db-coll# (if (string? coll#)
                    (.getCollection db# coll#)
                    coll#)
         query#   (-> (empty-query db-coll#) ~@body)]
     (exec query#)))
  [^String coll & body]
  `(binding [*query-collection* (if (string? ~coll)
                                  (.getCollection ^DB monger.core/*mongodb-database* ~coll)
                                  ~coll)]
     (let [query# (-> (empty-query *query-collection*) ~@body)]
       (exec query#))))

(defmacro partial-query
  [& body]
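The options and chain methods above come together in `with-collection`. A minimal usage sketch, assuming a MongoDB server on localhost and a `monger-test` database with a `docs` collection (all names are illustrative); this uses the master-branch arity that takes an explicit `db`:

```clojure
(require '[monger.core :as mg]
         '[monger.query :as q])

;; assumes a reachable MongoDB on localhost:27017
(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")]
  ;; second page of CA documents, 10 per page, newest first,
  ;; fetching only the fields we need
  (q/with-collection db "docs"
    (q/find {:state "CA"})
    (q/fields [:state :quantity])
    (q/sort {:created_at -1})
    (q/paginate :page 2 :per-page 10)))
```

On the 1.6.x branch the same chain omits the `db` argument and relies on the dynamic `monger.core/*mongodb-database*` binding instead.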
@ -1,43 +1,9 @@
;; This source code is dual-licensed under the Apache License, version
;; 2.0, and the Eclipse Public License, version 1.0.
;;
;; The APL v2.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team
;;
;; Licensed under the Apache License, Version 2.0 (the "License");
;; you may not use this file except in compliance with the License.
;; You may obtain a copy of the License at
;;
;; http://www.apache.org/licenses/LICENSE-2.0
;;
;; Unless required by applicable law or agreed to in writing, software
;; distributed under the License is distributed on an "AS IS" BASIS,
;; WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
;; See the License for the specific language governing permissions and
;; limitations under the License.
;; ----------------------------------------------------------------------------------
;;
;; The EPL v1.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team.
;; All rights reserved.
;;
;; This program and the accompanying materials are made available under the terms of
;; the Eclipse Public License Version 1.0,
;; which accompanies this distribution and is available at
;; http://www.eclipse.org/legal/epl-v10.html.
;; ----------------------------------------------------------------------------------

(ns monger.ragtime
  "Ragtime integration"
  (:refer-clojure :exclude [find sort])
  (:require [ragtime.protocols :as proto]
  (:require [ragtime.core :as ragtime]
            [monger.core :as mg]
            [monger.collection :as mc]
            [monger.query :refer [with-collection find sort]])
            [monger.collection :as mc])
  (:use [monger.query :only [with-collection find sort]])
  (:import java.util.Date
           [com.mongodb DB WriteConcern]))

@ -46,20 +12,23 @@
  migrations-collection "meta.migrations")

(extend-type com.mongodb.DB
  proto/DataStore
  ragtime/Migratable
  (add-migration-id [db id]
    (mc/insert db migrations-collection {:_id id :created_at (Date.)} WriteConcern/FSYNC_SAFE))
  (remove-migration-id [db id]
    (mc/remove-by-id db migrations-collection id))
  (applied-migration-ids [db]
    (let [xs (with-collection db migrations-collection
               (find {})
               (sort {:created_at 1}))]
      (vec (map :_id xs)))))
    (mg/with-db db
      (let [xs (with-collection migrations-collection
                 (find {})
                 (sort {:created_at 1}))]
        (vec (map :_id xs))))))

(defn flush-migrations!
  "REMOVES all the information about previously performed migrations"
  [^DB db]
  (mc/remove db migrations-collection))
  [db]
  (mg/with-db db
    (mc/remove migrations-collection)))
@ -1,72 +1,83 @@
;; Copyright (c) 2011-2012 Michael S. Klishin
;;
;; The use and distribution terms for this software are covered by the
;; Eclipse Public License 1.0 (http://opensource.org/licenses/eclipse-1.0.php)
;; which can be found in the file epl-v10.html at the root of this distribution.
;; By using this software in any fashion, you are agreeing to be bound by
;; the terms of this license.
;; You must not remove this notice, or any other, from this software.

(ns monger.result
  "Provides functions that determine if a query (or other database operation)
   was successful or not.
(ns ^{:doc "Provides functions that determine if a query (or other database operation)
            was successful or not.

   Related documentation guides:

   * http://clojuremongodb.info/articles/inserting.html
   * http://clojuremongodb.info/articles/updating.html
   * http://clojuremongodb.info/articles/commands.html
   * http://clojuremongodb.info/articles/mapreduce.html"
  (:import [com.mongodb WriteResult CommandResult])
   * http://clojuremongodb.info/articles/inserting.html
   * http://clojuremongodb.info/articles/updating.html
   * http://clojuremongodb.info/articles/commands.html
   * http://clojuremongodb.info/articles/mapreduce.html"}
  monger.result
  (:import [com.mongodb DBObject WriteResult MapReduceOutput]
           clojure.lang.IPersistentMap)
  (:require monger.conversion))

;;
;; Implementation
;;

(defn- okayish?
  [value]
  (contains? #{true "true" 1 1.0} value))

;;
;; API
;;

(defprotocol WriteResultPredicates
  (acknowledged? [input] "Returns true if write result is a success")
  (updated-existing? [input] "Returns true if write result has updated an existing document"))
(defprotocol MongoCommandResult
  (ok? [input] "Returns true if command result is a success")
  (has-error? [input] "Returns true if command result indicates an error")
  (updated-existing? [input] "Returns true if command result has `updatedExisting` field set to true"))

(extend-protocol MongoCommandResult
  DBObject
  (ok?
    [^DBObject result]
    (okayish? (.get result "ok")))
  (has-error?
    [^DBObject result]
    ;; yes, this is exactly the logic MongoDB Java driver uses.
    (> (count (str (.get result "err"))) 0))
  (updated-existing?
    [^DBObject result]
    (let [v ^Boolean (.get result "updatedExisting")]
      (and v (Boolean/valueOf v)))

(extend-protocol WriteResultPredicates
  WriteResult
  (acknowledged?
  (ok?
    [^WriteResult result]
    (.wasAcknowledged result))
    (and (not (nil? result)) (ok? (.getLastError result))))
  (has-error?
    [^WriteResult result]
    (has-error? (.getLastError result)))
  (updated-existing?
    [^WriteResult result]
    (.isUpdateOfExisting result))
    (updated-existing? (.getLastError result)))

  CommandResult
  (acknowledged?
    [^CommandResult result]
    (.ok result)))
  MapReduceOutput
  (ok?
    [^MapReduceOutput result]
    (ok? ^DBObject (.getRaw result)))

(defn affected-count
  "Get the number of documents affected"
  [^WriteResult result]
  (.getN result))
  Boolean
  (ok?
    [^Boolean b]
    (= Boolean/TRUE b))

  IPersistentMap
  (ok?
    [^IPersistentMap m]
    (okayish? (or (get m :ok)
                  (get m "ok")))))
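The two protocol generations reflect different drivers: `acknowledged?` reads `WriteResult#wasAcknowledged` on modern drivers, while the 1.6.x `ok?`/`has-error?` predicates inspect the legacy `getLastError` document. A consumption sketch against the master-branch API, assuming a running MongoDB (collection and field names are illustrative):

```clojure
(require '[monger.core :as mg]
         '[monger.collection :as mc]
         '[monger.operators :refer [$set]]
         '[monger.result :refer [acknowledged? updated-existing?]])

(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")
      ;; mc/update returns a com.mongodb.WriteResult, which satisfies
      ;; the WriteResultPredicates protocol
      res  (mc/update db "scores" {:username "l33r0y"} {$set {:score 100}})]
  (when (acknowledged? res)
    (println "write was acknowledged"))
  (when (updated-existing? res)
    (println "an existing document was modified")))
```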
@ -1,41 +1,8 @@
(ns monger.ring.session-store
  (:require [ring.middleware.session.store :as ringstore]
            [monger.collection :as mc]
            [monger.core :as mg]
            [monger.conversion :refer :all])
            [monger.multi.collection :as mc]
            [monger.core :as mg])
  (:use monger.conversion)
  (:import [java.util UUID Date]
           [com.mongodb DB]
           ring.middleware.session.store.SessionStore))

@ -102,8 +69,12 @@

(defn session-store
  [^DB db ^String s]
  (ClojureReaderBasedMongoDBSessionStore. db s))
  ([]
   (ClojureReaderBasedMongoDBSessionStore. mg/*mongodb-database* default-session-store-collection))
  ([^String s]
   (ClojureReaderBasedMongoDBSessionStore. mg/*mongodb-database* s))
  ([^DB db ^String s]
   (ClojureReaderBasedMongoDBSessionStore. db s)))

;; this session store won't store namespaced keywords correctly but stores results in a way

@ -131,5 +102,9 @@

(defn monger-store
  [^DB db ^String s]
  (MongoDBSessionStore. db s))
  ([]
   (MongoDBSessionStore. mg/*mongodb-database* default-session-store-collection))
  ([^String s]
   (MongoDBSessionStore. mg/*mongodb-database* s))
  ([^DB db ^String s]
   (MongoDBSessionStore. db s)))
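Both store constructors plug straight into Ring's session middleware; the master arity takes an explicit `DB`, while 1.6.x adds zero- and one-argument arities that fall back to the dynamic `*mongodb-database*`. A wiring sketch, assuming a Ring handler named `app-routes` (the handler and database names are illustrative):

```clojure
(require '[monger.core :as mg]
         '[monger.ring.session-store :refer [monger-store]]
         '[ring.middleware.session :refer [wrap-session]])

(def conn (mg/connect))
(def db   (mg/get-db conn "monger-test"))

;; sessions are persisted as documents in the "sessions" collection
(def app
  (wrap-session app-routes {:store (monger-store db "sessions")}))
```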
37
src/clojure/monger/search.clj
Normal file
@ -0,0 +1,37 @@
(ns monger.search
  "Full text search queries support (MongoDB 2.4+)"
  (:require [monger.command :as cmd]
            [monger.conversion :as cnv])
  (:import [com.mongodb CommandResult BasicDBList DBObject]))

;;
;; Implementation
;;

(defn- convert-hit
  [^DBObject dbo keywordize-keys?]
  (cnv/from-db-object dbo keywordize-keys?))

;;
;; API
;;

(defn search
  "Performs a full text search query"
  [^String collection query]
  (cmd/search collection query))

(defn results-from
  "Returns a lazy sequence of results from a search query response, sorted by score.

   Each result is a Clojure map with two keys: :score and :obj."
  ([^CommandResult res]
   (results-from res true))
  ([^CommandResult res keywordize-keys?]
   (let [sorter (if keywordize-keys?
                  :score
                  (fn [m]
                    (get m "score")))]
     (sort-by sorter >
              (map #(convert-hit % keywordize-keys?) ^BasicDBList (.get res "results"))))))
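`search` wraps the `text` database command (MongoDB 2.4's server-side full text search, later superseded by the `$text` query operator), and `results-from` unwraps the hits and sorts them by relevance score. A usage sketch, assuming a text index already exists on the collection and that the query is passed as a plain string (collection and field names are illustrative):

```clojure
(require '[monger.search :as ms])

;; each hit is a map with :score and :obj, best matches first
(let [res (ms/search "articles" "clojure mongodb")]
  (doseq [{:keys [score obj]} (ms/results-from res)]
    (println score (:title obj))))
```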
44
src/clojure/monger/testkit.clj
Normal file
@ -0,0 +1,44 @@
(ns ^{:doc "Automated testing helpers"}
  monger.testkit
  (:require [monger.collection :as mc]))

;;
;; API
;;

(defmacro defcleaner
  "Defines a fixture function that removes all documents from a collection. If collection is not specified,
   a conventionally named var will be used. Supposed to be used with clojure.test/use-fixtures but may
   be useful on its own.

   Examples:

   (defcleaner events) ;; collection name will be taken from the events-collection var
   (defcleaner people \"accounts\") ;; collection name is given
  "
  ([entities]
   (let [coll-arg (symbol (str entities "-collection"))
         fn-name  (symbol (str "purge-" entities))]
     `(defn ~fn-name
        [f#]
        (mc/remove ~coll-arg)
        (f#)
        (mc/remove ~coll-arg))))
  ([entities coll-name]
   (let [coll-arg (name coll-name)
         fn-name  (symbol (str "purge-" entities))]
     `(defn ~fn-name
        [f#]
        (mc/remove ~coll-arg)
        (f#)
        (mc/remove ~coll-arg)))))
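`defcleaner` expands into a fixture function named `purge-<entities>` that empties the collection before and after the wrapped test run. A sketch of pairing it with `clojure.test`, assuming the conventional `events-collection` var holds the collection name:

```clojure
(ns my.app.events-test
  (:require [clojure.test :refer [deftest is use-fixtures]]
            [monger.testkit :refer [defcleaner]]))

;; the one-argument form looks up this conventionally named var
(def events-collection "events")

;; defines purge-events, which removes all documents around each test
(defcleaner events)

(use-fixtures :each purge-events)
```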
@ -1,45 +1,19 @@
(ns ^{:doc "Provides various utility functions, primarily for working with document ids."} monger.util
  (:refer-clojure :exclude [random-uuid])
  (:import java.security.SecureRandom
           java.math.BigInteger
           org.bson.types.ObjectId
           com.mongodb.DBObject
           clojure.lang.IPersistentMap
           java.util.Map)
  (:refer-clojure :exclude [random-uuid]))
           java.util.Map))

;;
;; API

@ -56,11 +30,9 @@
  (.toString (new BigInteger n (SecureRandom.)) num-base))

(defn ^ObjectId object-id
  "Returns a new BSON object id, or converts str to BSON object id"
  ([]
   (ObjectId.))
  ([^String s]
   (ObjectId. s)))
  "Returns a new BSON object id"
  []
  (ObjectId.))

(defprotocol GetDocumentId
  (get-id [input] "Returns document id"))

@ -75,8 +47,3 @@
  (get-id
    [^IPersistentMap object]
    (or (:_id object) (object "_id"))))

(defn into-array-list
  "Coerce a j.u.Collection into a j.u.ArrayList."
  ^java.util.ArrayList [^java.util.Collection coll]
  (java.util.ArrayList. coll))
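`ObjectId`s embed their creation timestamp, so generating ids on the client (instead of letting the server assign `_id`) costs nothing and keeps documents self-describing. A small sketch; the string shown is an illustrative 24-character hex value, and the one-argument arity exists only on the master branch:

```clojure
(require '[monger.util :as mu])

;; fresh id; (str oid) yields a 24-character hex string
(def oid (mu/object-id))

;; master branch only: parse an existing hex id back into an ObjectId
(mu/object-id "4fea999d2c0f4d02b11b58b1")
```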
44
src/java/com/novemberain/monger/DBRef.java
Normal file
@ -0,0 +1,44 @@
package com.novemberain.monger;

import clojure.lang.IDeref;
import com.mongodb.DB;
import com.mongodb.DBObject;
import org.bson.BSONObject;

/**
 * Exactly as com.mongodb.DBRef but also implements Clojure IDeref for @dereferencing
 */
public class DBRef extends com.mongodb.DBRef implements IDeref {

    /**
     * Creates a DBRef
     * @param db the database
     * @param o a BSON object representing the reference
     */
    public DBRef(DB db, BSONObject o) {
        super(db, o.get("$ref").toString(), o.get("$id"));
    }

    /**
     * Creates a DBRef
     * @param db the database
     * @param ns the namespace where the object is stored
     * @param id the object id
     */
    public DBRef(DB db, String ns, Object id) {
        super(db, ns, id);
    }

    /**
     * Creates a DBRef from a com.mongodb.DBRef instance.
     * @param source The original reference MongoDB Java driver uses
     */
    public DBRef(com.mongodb.DBRef source) {
        this(source.getDB(), source.getRef(), source.getId());
    }

    @Override
    public DBObject deref() {
        return this.fetch();
    }
}
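Because the subclass implements `clojure.lang.IDeref`, a Monger `DBRef` can be dereferenced with the reader's `@`, just like an atom or a delay; `deref` delegates to the driver's `fetch`. A sketch from the Clojure side, assuming `db` is a connected `com.mongodb.DB` and `owner-id` an existing document id (both illustrative):

```clojure
(import 'com.novemberain.monger.DBRef)

;; points at the document with _id owner-id in the "owners" collection;
;; @ref calls .fetch under the hood and returns the referenced DBObject
(let [ref (DBRef. db "owners" owner-id)]
  @ref)
```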
@ -1,136 +1,69 @@
(ns monger.test.aggregation-framework-test
  (:require [monger.core :as mg]
            [monger.collection :as mc]
            [clojure.test :refer :all]
            [monger.operators :refer :all]))

(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")
      coll "docs"]
  (defn purge-collections
    [f]
    (mc/purge-many db [coll])
    (f)
    (mc/purge-many db [coll]))

  (use-fixtures :each purge-collections)

  (deftest test-basic-single-stage-$project-aggregation-no-keywordize
    (let [batch    [{"state" "CA" "quantity" 1 "price" 199.00}
                    {"state" "NY" "quantity" 2 "price" 199.00}
                    {"state" "NY" "quantity" 1 "price" 299.00}
                    {"state" "IL" "quantity" 2 "price" 11.50}
                    {"state" "CA" "quantity" 2 "price" 2.95}
                    {"state" "IL" "quantity" 3 "price" 5.50}]
          expected #{{"quantity" 1 "state" "CA"}
                     {"quantity" 2 "state" "NY"}
                     {"quantity" 1 "state" "NY"}
                     {"quantity" 2 "state" "IL"}
                     {"quantity" 2 "state" "CA"}
                     {"quantity" 3 "state" "IL"}}]
      (mc/insert-batch db coll batch)
      (is (= 6 (mc/count db coll)))
      (let [result (->> (mc/aggregate db coll [{$project {"state" 1 "quantity" 1}}] :keywordize false)
                        (map #(select-keys % ["state" "quantity"]))
                        (set))]
        (is (= expected result)))))

  (deftest test-basic-single-stage-$project-aggregation
    (let [batch    [{:state "CA" :quantity 1 :price 199.00}
                    {:state "NY" :quantity 2 :price 199.00}
                    {:state "NY" :quantity 1 :price 299.00}
                    {:state "IL" :quantity 2 :price 11.50}
                    {:state "CA" :quantity 2 :price 2.95}
                    {:state "IL" :quantity 3 :price 5.50}]
          expected #{{:quantity 1 :state "CA"}
                     {:quantity 2 :state "NY"}
                     {:quantity 1 :state "NY"}
                     {:quantity 2 :state "IL"}
                     {:quantity 2 :state "CA"}
                     {:quantity 3 :state "IL"}}]
      (mc/insert-batch db coll batch)
      (is (= 6 (mc/count db coll)))
      (let [result (set (map #(select-keys % [:state :quantity])
                             (mc/aggregate db coll [{$project {:state 1 :quantity 1}}])))]
        (is (= expected result)))))

  (:require monger.core [monger.collection :as mc]
            [monger.test.helper :as helper])
  (:use clojure.test
        monger.operators
        monger.test.fixtures))

  (deftest test-basic-projection-with-multiplication
    (let [batch    [{:state "CA" :quantity 1 :price 199.00}
                    {:state "NY" :quantity 2 :price 199.00}
                    {:state "NY" :quantity 1 :price 299.00}
                    {:state "IL" :quantity 2 :price 11.50}
                    {:state "CA" :quantity 2 :price 2.95}
                    {:state "IL" :quantity 3 :price 5.50}]
          expected #{{:_id "NY" :subtotal 398.0}
                     {:_id "NY" :subtotal 299.0}
                     {:_id "IL" :subtotal 23.0}
                     {:_id "CA" :subtotal 5.9}
                     {:_id "IL" :subtotal 16.5}
                     {:_id "CA" :subtotal 199.0}}]
      (mc/insert-batch db coll batch)
      (let [result (set (mc/aggregate db coll [{$project {:subtotal {$multiply ["$quantity", "$price"]}
                                                          :_id "$state"}}]))]
        (is (= expected result)))))

(helper/connect!)

(use-fixtures :each purge-docs)

(deftest ^{:edge-features true} test-basic-single-stage-$project-aggregation
  (let [collection "docs"
        batch [{ :state "CA" :quantity 1 :price 199.00 }
               { :state "NY" :quantity 2 :price 199.00 }
               { :state "NY" :quantity 1 :price 299.00 }
               { :state "IL" :quantity 2 :price 11.50 }
               { :state "CA" :quantity 2 :price 2.95 }
               { :state "IL" :quantity 3 :price 5.50 }]
        expected #{{:quantity 1 :state "CA"}
                   {:quantity 2 :state "NY"}
                   {:quantity 1 :state "NY"}
                   {:quantity 2 :state "IL"}
                   {:quantity 2 :state "CA"}
                   {:quantity 3 :state "IL"}}]
    (mc/insert-batch collection batch)
    (is (= 6 (mc/count collection)))
    (let [result (set (map #(select-keys % [:state :quantity])
                           (mc/aggregate "docs" [{$project {:state 1 :quantity 1}}])))]
      (is (= expected result)))))

  (deftest test-basic-total-aggregation
    (let [batch    [{:state "CA" :quantity 1 :price 199.00}
                    {:state "NY" :quantity 2 :price 199.00}
                    {:state "NY" :quantity 1 :price 299.00}
                    {:state "IL" :quantity 2 :price 11.50}
                    {:state "CA" :quantity 2 :price 2.95}
                    {:state "IL" :quantity 3 :price 5.50}]
          expected #{{:_id "CA" :total 204.9} {:_id "IL" :total 39.5} {:_id "NY" :total 697.0}}]
      (mc/insert-batch db coll batch)
      (let [result (set (mc/aggregate db coll [{$project {:subtotal {$multiply ["$quantity", "$price"]}
                                                          :_id 1
                                                          :state 1}}
                                               {$group {:_id "$state"
                                                        :total {$sum "$subtotal"}}}]))]
        (is (= expected result)))))

(deftest ^{:edge-features true} test-basic-projection-with-multiplication
  (let [collection "docs"
        batch [{ :state "CA" :quantity 1 :price 199.00 }
               { :state "NY" :quantity 2 :price 199.00 }
               { :state "NY" :quantity 1 :price 299.00 }
               { :state "IL" :quantity 2 :price 11.50 }
               { :state "CA" :quantity 2 :price 2.95 }
               { :state "IL" :quantity 3 :price 5.50 }]
        expected #{{:_id "NY" :subtotal 398.0}
                   {:_id "NY" :subtotal 299.0}
                   {:_id "IL" :subtotal 23.0}
                   {:_id "CA" :subtotal 5.9}
                   {:_id "IL" :subtotal 16.5}
                   {:_id "CA" :subtotal 199.0}}]
    (mc/insert-batch collection batch)
    (let [result (set (mc/aggregate "docs" [{$project {:subtotal {$multiply ["$quantity", "$price"]}
                                                       :_id "$state"}}]))]
      (is (= expected result)))))

  (deftest test-$first-aggregation-operator
    (let [batch    [{:state "CA"}
                    {:state "IL"}]
          expected "CA"]
      (mc/insert-batch db coll batch)
      (let [result (:state (first (mc/aggregate db coll [{$group {:_id 1 :state {$first "$state"}}}])))]
        (is (= expected result)))))

  (deftest test-$last-aggregation-operator
    (let [batch    [{:state "CA"}
                    {:state "IL"}]
          expected "IL"]
      (mc/insert-batch db coll batch)
      (let [result (:state (first (mc/aggregate db coll [{$group {:_id 1 :state {$last "$state"}}}])))]
        (is (= expected result)))))

  (deftest test-cursor-aggregation
    (let [batch    [{:state "CA" :quantity 1 :price 199.00}
                    {:state "NY" :quantity 2 :price 199.00}
                    {:state "NY" :quantity 1 :price 299.00}
                    {:state "IL" :quantity 2 :price 11.50}
                    {:state "CA" :quantity 2 :price 2.95}
                    {:state "IL" :quantity 3 :price 5.50}]
          expected #{{:quantity 1 :state "CA"}
                     {:quantity 2 :state "NY"}
                     {:quantity 1 :state "NY"}
                     {:quantity 2 :state "IL"}
                     {:quantity 2 :state "CA"}
                     {:quantity 3 :state "IL"}}]
      (mc/insert-batch db coll batch)
      (is (= 6 (mc/count db coll)))
      (let [result (set (map #(select-keys % [:state :quantity])
                             (mc/aggregate db coll [{$project {:state 1 :quantity 1}}] :cursor {:batch-size 10})))]
        (is (= expected result)))))

  (deftest test-explain-aggregate
    (let [batch [{:state "CA" :price 100}
                 {:state "CA" :price 10}
                 {:state "IL" :price 50}]]
      (mc/insert-batch db coll batch)
      (let [result (mc/explain-aggregate db coll [{$match {:state "CA"}}])]
        (is (:ok result))))))

(deftest ^{:edge-features true} test-basic-total-aggregation
  (let [collection "docs"
        batch [{ :state "CA" :quantity 1 :price 199.00 }
               { :state "NY" :quantity 2 :price 199.00 }
               { :state "NY" :quantity 1 :price 299.00 }
               { :state "IL" :quantity 2 :price 11.50 }
               { :state "CA" :quantity 2 :price 2.95 }
               { :state "IL" :quantity 3 :price 5.50 }]
        expected #{{:_id "CA" :total 204.9} {:_id "IL" :total 39.5} {:_id "NY" :total 697.0}}]
    (mc/insert-batch collection batch)
    (let [result (set (mc/aggregate "docs" [{$project {:subtotal {$multiply ["$quantity", "$price"]}
                                                       :_id 1
                                                       :state 1}}
                                            {$group {:_id "$state"
                                                     :total {$sum "$subtotal"}}}]))]
      (is (= expected result)))))
@ -1,461 +1,367 @@
|
|||
(set! *warn-on-reflection* true)
|
||||
|
||||
(ns monger.test.atomic-modifiers-test
|
||||
(:import [com.mongodb WriteResult WriteConcern DBObject]
|
||||
org.bson.types.ObjectId
|
||||
java.util.Date)
|
||||
(:require [monger.core :as mg]
|
||||
[monger.collection :as mc]
|
||||
[monger.result :refer [acknowledged?]]
|
||||
[clojure.test :refer :all]
|
||||
[monger.operators :refer :all]))
|
||||
(:import [com.mongodb WriteResult WriteConcern DBCursor DBObject]
|
||||
[org.bson.types ObjectId]
|
||||
[java.util Date])
|
||||
(:require [monger core util]
|
||||
[monger.collection :as mgcol]
|
||||
[monger.result :as mgres]
|
||||
[monger.test.helper :as helper])
|
||||
(:use [clojure.test]
|
||||
[monger.operators]
|
||||
[monger.test.fixtures]))
|
||||
|
||||
(helper/connect!)
|
||||
|
||||
(use-fixtures :each purge-docs purge-things purge-scores)
|
||||
|
||||
|
||||
(let [conn (mg/connect)
|
||||
db (mg/get-db conn "monger-test")]
|
||||
;;
|
||||
;; $inc
|
||||
;;
|
||||
|
||||
(defn purge-collections
|
||||
[f]
|
||||
(mc/remove db "docs")
|
||||
(mc/remove db "things")
|
||||
(mc/remove db "scores")
|
||||
(f)
|
||||
(mc/remove db "docs")
|
||||
(mc/remove db "things")
|
||||
(mc/remove db "scores"))
|
||||
(deftest increment-a-single-existing-field-using-$inc-modifier
  (let [coll "scores"
        oid (ObjectId.)]
    (mgcol/insert coll { :_id oid :username "l33r0y" :score 100 })
    (mgcol/update coll { :_id oid } { $inc { :score 20 } })
    (is (= 120 (:score (mgcol/find-map-by-id coll oid))))))

(use-fixtures :each purge-collections)

;;
;; $inc
;;

(deftest increment-a-single-existing-field-using-$inc-modifier
  (let [coll "scores"
        oid (ObjectId.)]
    (mc/insert db coll {:_id oid :username "l33r0y" :score 100})
    (mc/update db coll {:_id oid} {$inc {:score 20}})
    (is (= 120 (:score (mc/find-map-by-id db coll oid))))))

(deftest set-a-single-non-existing-field-using-$inc-modifier
  (let [coll "scores"
        oid (ObjectId.)]
    (mc/insert db coll {:_id oid :username "l33r0y"})
    (mc/update db coll {:_id oid} {$inc {:score 30}})
    (is (= 30 (:score (mc/find-map-by-id db coll oid))))))

(deftest set-a-single-non-existing-field-using-$inc-modifier
  (let [coll "scores"
        oid (ObjectId.)]
    (mgcol/insert coll { :_id oid :username "l33r0y" })
    (mgcol/update coll { :_id oid } { $inc { :score 30 } })
    (is (= 30 (:score (mgcol/find-map-by-id coll oid))))))

(deftest increment-multiple-existing-fields-using-$inc-modifier
  (let [coll "scores"
        oid (ObjectId.)]
    (mc/insert db coll {:_id oid :username "l33r0y" :score 100 :bonus 0})
    (mc/update db coll {:_id oid} {$inc {:score 20 :bonus 10}})
    (is (= {:_id oid :score 120 :bonus 10 :username "l33r0y"}
           (mc/find-map-by-id db coll oid)))))

(deftest increment-multiple-existing-fields-using-$inc-modifier
  (let [coll "scores"
        oid (ObjectId.)]
    (mgcol/insert coll { :_id oid :username "l33r0y" :score 100 :bonus 0 })
    (mgcol/update coll { :_id oid } {$inc { :score 20 :bonus 10 } })
    (is (= { :_id oid :score 120 :bonus 10 :username "l33r0y" } (mgcol/find-map-by-id coll oid)))))

(deftest increment-and-set-multiple-existing-fields-using-$inc-modifier
  (let [coll "scores"
        oid (ObjectId.)]
    (mc/insert db coll {:_id oid :username "l33r0y" :score 100})
    (mc/update db coll {:_id oid} {$inc {:score 20 :bonus 10}})
    (is (= {:_id oid :score 120 :bonus 10 :username "l33r0y"}
           (mc/find-map-by-id db coll oid)))))

(deftest increment-and-set-multiple-existing-fields-using-$inc-modifier
  (let [coll "scores"
        oid (ObjectId.)]
    (mgcol/insert coll { :_id oid :username "l33r0y" :score 100 })
    (mgcol/update coll { :_id oid } { $inc { :score 20 :bonus 10 } })
    (is (= { :_id oid :score 120 :bonus 10 :username "l33r0y" } (mgcol/find-map-by-id coll oid)))))
;;
;; $set
;;
;;
;; $set
;;

(deftest update-a-single-existing-field-using-$set-modifier
  (let [coll "things"
        oid (ObjectId.)]
    (mc/insert db coll {:_id oid :weight 10.0})
    (mc/update db coll {:_id oid} {$set {:weight 20.5}})
    (is (= 20.5 (:weight (mc/find-map-by-id db coll oid [:weight]))))))

(deftest update-a-single-existing-field-using-$set-modifier
  (let [coll "things"
        oid (ObjectId.)]
    (mgcol/insert coll { :_id oid :weight 10.0 })
    (mgcol/update coll { :_id oid } { $set { :weight 20.5 } })
    (is (= 20.5 (:weight (mgcol/find-map-by-id coll oid [:weight]))))))

(deftest set-a-single-non-existing-field-using-$set-modifier
  (let [coll "things"
        oid (ObjectId.)]
    (mc/insert db coll {:_id oid :weight 10.0})
    (mc/update db coll {:_id oid} {$set {:height 17.2}})
    (is (= 17.2 (:height (mc/find-map-by-id db coll oid [:height]))))))

(deftest set-a-single-non-existing-field-using-$set-modifier
  (let [coll "things"
        oid (ObjectId.)]
    (mgcol/insert coll { :_id oid :weight 10.0 })
    (mgcol/update coll { :_id oid } { $set { :height 17.2 } })
    (is (= 17.2 (:height (mgcol/find-map-by-id coll oid [:height]))))))

(deftest update-multiple-existing-fields-using-$set-modifier
  (let [coll "things"
        oid (ObjectId.)]
    (mc/insert db coll {:_id oid :weight 10.0 :height 15.2})
    (mc/update db coll {:_id oid} {$set {:weight 20.5 :height 25.6}})
    (is (= {:_id oid :weight 20.5 :height 25.6}
           (mc/find-map-by-id db coll oid [:weight :height])))))

(deftest update-multiple-existing-fields-using-$set-modifier
  (let [coll "things"
        oid (ObjectId.)]
    (mgcol/insert coll { :_id oid :weight 10.0 :height 15.2 })
    (mgcol/update coll { :_id oid } { $set { :weight 20.5 :height 25.6 } })
    (is (= { :_id oid :weight 20.5 :height 25.6 } (mgcol/find-map-by-id coll oid [:weight :height])))))

(deftest update-and-set-multiple-fields-using-$set-modifier
  (let [coll "things"
        oid (ObjectId.)]
    (mc/insert db coll {:_id oid :weight 10.0})
    (mc/update db coll {:_id oid} {$set {:weight 20.5 :height 25.6}})
    (is (= {:_id oid :weight 20.5 :height 25.6}
           (mc/find-map-by-id db coll oid [:weight :height])))))

(deftest update-and-set-multiple-fields-using-$set-modifier
  (let [coll "things"
        oid (ObjectId.)]
    (mgcol/insert coll { :_id oid :weight 10.0 })
    (mgcol/update coll { :_id oid } {$set { :weight 20.5 :height 25.6 } })
    (is (= { :_id oid :weight 20.5 :height 25.6 } (mgcol/find-map-by-id coll oid [:weight :height])))))
;;
;; $unset
;;
;;
;; $unset
;;

(deftest unset-a-single-existing-field-using-$unset-modifier
(deftest unset-a-single-existing-field-using-$unset-modifier
  (let [coll "docs"
        oid (ObjectId.)]
    (mgcol/insert coll { :_id oid :title "Document 1" :published true })
    (mgcol/update coll { :_id oid } { $unset { :published 1 } })
    (is (= { :_id oid :title "Document 1" } (mgcol/find-map-by-id coll oid)))))

(deftest unset-multiple-existing-fields-using-$unset-modifier
  (let [coll "docs"
        oid (ObjectId.)]
    (mgcol/insert coll { :_id oid :title "Document 1" :published true :featured true })
    (mgcol/update coll { :_id oid } { $unset { :published 1 :featured true } })
    (is (= { :_id oid :title "Document 1" } (mgcol/find-map-by-id coll oid)))))

(deftest unsetting-an-unexisting-field-using-$unset-modifier-is-not-considered-an-issue
  (let [coll "docs"
        oid (ObjectId.)]
    (mgcol/insert coll { :_id oid :title "Document 1" :published true })
    (is (mgres/ok? (mgcol/update coll { :_id oid } { $unset { :published 1 :featured true } })))
    (is (= { :_id oid :title "Document 1" } (mgcol/find-map-by-id coll oid)))))

;;
;; $setOnInsert
;;

(deftest setOnInsert-in-upsert-for-non-existing-document
  (let [coll "docs"
        now 456
        oid (ObjectId.)]
    (mgcol/find-and-modify coll {:_id oid} {$set {:lastseen now} $setOnInsert {:firstseen now}} :upsert true)
    (is (= { :_id oid :lastseen now :firstseen now} (mgcol/find-map-by-id coll oid)))))

(deftest setOnInsert-in-upsert-for-existing-document
  (let [coll "docs"
        before 123
        now 456
        oid (ObjectId.)]
    (mgcol/insert coll { :_id oid :firstseen before :lastseen before})
    (mgcol/find-and-modify coll {:_id oid} {$set {:lastseen now} $setOnInsert {:firstseen now}} :upsert true)
    (is (= { :_id oid :lastseen now :firstseen before} (mgcol/find-map-by-id coll oid)))))
;;
;; $push
;;

(deftest initialize-an-array-using-$push-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$push modifier appends value to field"]
    (mgcol/insert coll { :_id oid :title title })
    (mgcol/update coll { :_id oid } { $push { :tags "modifiers" } })
    (is (= { :_id oid :title title :tags ["modifiers"] } (mgcol/find-map-by-id coll oid)))))

(deftest add-value-to-an-existing-array-using-$push-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$push modifier appends value to field"]
    (mgcol/insert coll { :_id oid :title title :tags ["mongodb"] })
    (mgcol/update coll { :_id oid } { $push { :tags "modifiers" } })
    (is (= { :_id oid :title title :tags ["mongodb" "modifiers"] } (mgcol/find-map-by-id coll oid)))))

;; this is a common mistake, I leave it here to demonstrate it. You almost never
;; actually want to do this! What you really want is to use $pushAll instead of $push. MK.
(deftest add-array-value-to-an-existing-array-using-$push-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$push modifier appends value to field"]
    (mgcol/insert coll { :_id oid :title title :tags ["mongodb"] })
    (mgcol/update coll { :_id oid } { $push { :tags ["modifiers" "operators"] } })
    (is (= { :_id oid :title title :tags ["mongodb" ["modifiers" "operators"]] } (mgcol/find-map-by-id coll oid)))))
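The comment above flags the pitfall this test exercises: `$push` treats its operand as a single element, so pushing a vector nests it, whereas `$pushAll` (later, `$push` with `$each`) appends each element. A minimal Python sketch of the array semantics involved, using plain lists in place of MongoDB arrays (the helper names are illustrative, not part of any driver):

```python
def push(arr, value):
    # $push appends its operand as one element, even when it is an array
    return arr + [value]

def push_each(arr, values):
    # $pushAll / $push with $each appends every element of the operand
    return arr + list(values)

tags = ["mongodb"]
print(push(tags, ["modifiers", "operators"]))       # ['mongodb', ['modifiers', 'operators']]
print(push_each(tags, ["modifiers", "operators"]))  # ['mongodb', 'modifiers', 'operators']
```

This mirrors the expected value in the test above: the pushed vector ends up as a single nested element of `:tags`.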
(deftest double-add-value-to-an-existing-array-using-$push-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$push modifier appends value to field"]
    (mgcol/insert coll { :_id oid :title title :tags ["mongodb"] })
    (mgcol/update coll { :_id oid } { $push { :tags "modifiers" } })
    (mgcol/update coll { :_id oid } { $push { :tags "modifiers" } })
    (is (= { :_id oid :title title :tags ["mongodb" "modifiers" "modifiers"] } (mgcol/find-map-by-id coll oid)))))

;;
;; $pushAll
;;

(deftest initialize-an-array-using-$pushAll-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$pushAll modifier appends multiple values to field"]
    (mgcol/insert coll { :_id oid :title title })
    (mgcol/update coll { :_id oid } { $pushAll { :tags ["mongodb" "docs"] } })
    (is (= { :_id oid :title title :tags ["mongodb" "docs"] } (mgcol/find-map-by-id coll oid)))))

(deftest add-value-to-an-existing-array-using-$pushAll-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$pushAll modifier appends multiple values to field"]
    (mgcol/insert coll { :_id oid :title title :tags ["mongodb"] })
    (mgcol/update coll { :_id oid } { $pushAll { :tags ["modifiers" "docs"] } })
    (is (= { :_id oid :title title :tags ["mongodb" "modifiers" "docs"] } (mgcol/find-map-by-id coll oid)))))

(deftest double-add-value-to-an-existing-array-using-$pushAll-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$pushAll modifier appends multiple values to field"]
    (mgcol/insert coll { :_id oid :title title :tags ["mongodb" "docs"] })
    (mgcol/update coll { :_id oid } { $pushAll { :tags ["modifiers" "docs"] } })
    (is (= { :_id oid :title title :tags ["mongodb" "docs" "modifiers" "docs"] } (mgcol/find-map-by-id coll oid)))))
;;
;; $addToSet
;;

(deftest initialize-an-array-using-$addToSet-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$addToSet modifier appends value to field unless it is already there"]
    (mgcol/insert coll { :_id oid :title title })
    (mgcol/update coll { :_id oid } { $addToSet { :tags "modifiers" } })
    (is (= { :_id oid :title title :tags ["modifiers"] } (mgcol/find-map-by-id coll oid)))))

(deftest add-value-to-an-existing-array-using-$addToSet-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$addToSet modifier appends value to field unless it is already there"]
    (mgcol/insert coll { :_id oid :title title :tags ["mongodb"] })
    (mgcol/update coll { :_id oid } { $addToSet { :tags "modifiers" } })
    (is (= { :_id oid :title title :tags ["mongodb" "modifiers"] } (mgcol/find-map-by-id coll oid)))))

(deftest double-add-value-to-an-existing-array-using-$addToSet-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$addToSet modifier appends value to field unless it is already there"]
    (mgcol/insert coll { :_id oid :title title :tags ["mongodb"] })
    (mgcol/update coll { :_id oid } { $addToSet { :tags "modifiers" } })
    (mgcol/update coll { :_id oid } { $addToSet { :tags "modifiers" } })
    (is (= { :_id oid :title title :tags ["mongodb" "modifiers"] } (mgcol/find-map-by-id coll oid)))))
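The `$addToSet` tests above rely on its set semantics: a value is appended only when it is not already present, so the repeated update in the last test is a no-op. A rough Python sketch of that behaviour over a plain list (illustrative helper, not driver code):

```python
def add_to_set(arr, value):
    # $addToSet appends the value only if it is not already present
    return arr if value in arr else arr + [value]

tags = add_to_set(["mongodb"], "modifiers")
tags = add_to_set(tags, "modifiers")  # second add is a no-op
print(tags)  # ['mongodb', 'modifiers']
```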
;;
;; $pop
;;

(deftest pop-last-value-in-the-array-using-$pop-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$pop modifier removes last or first value in the array"]
    (mgcol/insert coll { :_id oid :title title :tags ["products" "apple" "reviews"] })
    (mgcol/update coll { :_id oid } { $pop { :tags 1 } })
    (is (= { :_id oid :title title :tags ["products" "apple"] } (mgcol/find-map-by-id coll oid)))))

(deftest unshift-first-value-in-the-array-using-$pop-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$pop modifier removes last or first value in the array"]
    (mgcol/insert coll { :_id oid :title title :tags ["products" "apple" "reviews"] })
    (mgcol/update coll { :_id oid } { $pop { :tags -1 } })
    (is (= { :_id oid :title title :tags ["apple" "reviews"] } (mgcol/find-map-by-id coll oid)))))

(deftest pop-last-values-from-multiple-arrays-using-$pop-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$pop modifier removes last or first value in the array"]
    (mgcol/insert coll { :_id oid :title title :tags ["products" "apple" "reviews"] :categories ["apple" "reviews" "drafts"] })
    (mgcol/update coll { :_id oid } { $pop { :tags 1 :categories 1 } })
    (is (= { :_id oid :title title :tags ["products" "apple"] :categories ["apple" "reviews"] } (mgcol/find-map-by-id coll oid)))))
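As the three tests above exercise, `$pop` removes one element from either end of an array: `1` pops the last element, `-1` the first. Sketched in Python over a plain list (hypothetical helper for illustration):

```python
def pop(arr, direction):
    # $pop 1 removes the last element; $pop -1 removes the first
    return arr[:-1] if direction == 1 else arr[1:]

tags = ["products", "apple", "reviews"]
print(pop(tags, 1))   # ['products', 'apple']
print(pop(tags, -1))  # ['apple', 'reviews']
```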
;;
;; $pull
;;

(deftest remove-all-value-entries-from-array-using-$pull-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$pull modifier removes all value entries in the array"]
    (mgcol/insert coll { :_id oid :title title :measurements [1.0 1.2 1.2 1.2 1.1 1.1 1.2 1.3 1.0] })
    (mgcol/update coll { :_id oid } { $pull { :measurements 1.2 } })
    (is (= { :_id oid :title title :measurements [1.0 1.1 1.1 1.3 1.0] } (mgcol/find-map-by-id coll oid)))))

(deftest remove-all-value-entries-from-array-using-$pull-modifier-based-on-a-condition
  (let [coll "docs"
        oid (ObjectId.)
        title "$pull modifier removes all value entries in the array"]
    (mgcol/insert coll { :_id oid :title title :measurements [1.0 1.2 1.2 1.2 1.1 1.1 1.2 1.3 1.0] })
    (mgcol/update coll { :_id oid } { $pull { :measurements { $gte 1.2 } } })
    (is (= { :_id oid :title title :measurements [1.0 1.1 1.1 1.0] } (mgcol/find-map-by-id coll oid)))))
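The two tests above show both forms of `$pull`: with a plain value it removes every equal element, and with a condition such as `{$gte 1.2}` it removes every matching element. A small Python sketch of the list semantics (illustrative helpers only):

```python
def pull(arr, value):
    # $pull with a plain value removes every element equal to it
    return [x for x in arr if x != value]

def pull_where(arr, predicate):
    # $pull with a condition removes every element matching it
    return [x for x in arr if not predicate(x)]

ms = [1.0, 1.2, 1.2, 1.2, 1.1, 1.1, 1.2, 1.3, 1.0]
print(pull(ms, 1.2))                       # [1.0, 1.1, 1.1, 1.3, 1.0]
print(pull_where(ms, lambda x: x >= 1.2))  # [1.0, 1.1, 1.1, 1.0]
```

These outputs match the expected `:measurements` values asserted in the tests.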
;;
;; $pullAll
;;

(deftest remove-all-value-entries-from-array-using-$pullAll-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$pullAll modifier removes entries of multiple values in the array"]
    (mgcol/insert coll { :_id oid :title title :measurements [1.0 1.2 1.2 1.2 1.1 1.1 1.2 1.3 1.0] })
    (mgcol/update coll { :_id oid } { $pullAll { :measurements [1.0 1.1 1.2] } })
    (is (= { :_id oid :title title :measurements [1.3] } (mgcol/find-map-by-id coll oid)))))

;;
;; $rename
;;

(deftest rename-a-single-field-using-$rename-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$rename renames fields"
        v [1.0 1.2 1.2 1.2 1.1 1.1 1.2 1.3 1.0]]
    (mgcol/insert coll { :_id oid :title title :measurements v })
    (mgcol/update coll { :_id oid } { $rename { :measurements "results" } })
    (is (= { :_id oid :title title :results v } (mgcol/find-map-by-id coll oid)))))

;;
;; find-and-modify
;;

(deftest find-and-modify-a-single-document
  (let [coll "docs"
        oid (ObjectId.)
        doc {:_id oid :name "Sophie Bangs" :level 42}
        conditions {:name "Sophie Bangs"}
        update {$inc {:level 1}}]
    (mgcol/insert coll doc)
    (let [res (mgcol/find-and-modify coll conditions update :return-new true)]
      (is (= (select-keys res [:name :level]) {:name "Sophie Bangs" :level 43})))))

(deftest find-and-modify-remove-a-document
  (let [coll "docs"
        oid (ObjectId.)
        doc {:_id oid :name "Sophie Bangs" :level 42}
        conditions {:name "Sophie Bangs"}]
    (mgcol/insert coll doc)
    (let [res (mgcol/find-and-modify coll conditions {} :remove true)]
      (is (= (select-keys res [:name :level]) {:name "Sophie Bangs" :level 42}))
      (is (empty? (mgcol/find-maps coll conditions))))))
(deftest find-and-modify-upsert-a-document
  (testing "case 1"
    (let [coll "docs"
          oid (ObjectId.)]
      (mc/insert db coll {:_id oid :title "Document 1" :published true})
      (mc/update db coll {:_id oid} {$unset {:published 1}})
      (is (= {:_id oid :title "Document 1"}
             (mc/find-map-by-id db coll oid)))))

(deftest unset-multiple-existing-fields-using-$unset-modifier
  (let [coll "docs"
        oid (ObjectId.)]
    (mc/insert db coll {:_id oid :title "Document 1" :published true :featured true})
    (mc/update db coll {:_id oid} {$unset {:published 1 :featured true}})
    (is (= {:_id oid :title "Document 1"}
           (mc/find-map-by-id db coll oid)))))

(deftest unsetting-an-unexisting-field-using-$unset-modifier-is-not-considered-an-issue
  (let [coll "docs"
        oid (ObjectId.)]
    (mc/insert db coll {:_id oid :title "Document 1" :published true})
    (is (acknowledged? (mc/update db coll {:_id oid} {$unset {:published 1 :featured true}})))
    (is (= {:_id oid :title "Document 1"}
           (mc/find-map-by-id db coll oid)))))

;;
;; $setOnInsert
;;

(deftest setOnInsert-in-upsert-for-non-existing-document
  (let [coll "docs"
        now 456
        oid (ObjectId.)]
    (mc/find-and-modify db coll {:_id oid} {$set {:lastseen now} $setOnInsert {:firstseen now}} {:upsert true})
    (is (= {:_id oid :lastseen now :firstseen now}
           (mc/find-map-by-id db coll oid)))))

(deftest setOnInsert-in-upsert-for-existing-document
  (let [coll "docs"
        before 123
        now 456
        oid (ObjectId.)]
    (mc/insert db coll {:_id oid :firstseen before :lastseen before})
    (mc/find-and-modify db coll {:_id oid} {$set {:lastseen now} $setOnInsert {:firstseen now}} {:upsert true})
    (is (= {:_id oid :lastseen now :firstseen before}
           (mc/find-map-by-id db coll oid)))))

;;
;; $push
;;

(deftest initialize-an-array-using-$push-modifier
          oid (ObjectId.)
          doc {:_id oid :name "Sophie Bangs" :level 42}]
      (let [res (mgcol/find-and-modify coll doc doc :upsert true)]
        (is (empty? res))
        (is (select-keys (mgcol/find-map-by-id coll oid) [:name :level]) (dissoc doc :_id)))))
  (testing "case 2"
    (let [coll "docs"
          oid (ObjectId.)
          title "$push modifier appends value to field"]
      (mc/insert db coll {:_id oid :title title})
      (mc/update db coll {:_id oid} {$push {:tags "modifiers"}})
      (is (= {:_id oid :title title :tags ["modifiers"]}
             (mc/find-map-by-id db coll oid)))))

(deftest add-value-to-an-existing-array-using-$push-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$push modifier appends value to field"]
    (mc/insert db coll {:_id oid :title title :tags ["mongodb"]})
    (mc/update db coll {:_id oid} {$push {:tags "modifiers"}})
    (is (= {:_id oid :title title :tags ["mongodb" "modifiers"]}
           (mc/find-map-by-id db coll oid)))))
          query {:name "Sophie Bangs"}
          doc (merge query {:level 42})]
      (let [res (mgcol/find-and-modify coll query doc :upsert true :return-new true)]
        (is (:_id res))
        (is (select-keys (mgcol/find-map-by-id coll (:_id res)) [:name :level]) doc)))))

;; this is a common mistake, I leave it here to demonstrate it. You almost never
;; actually want to do this! What you really want is to use $push with $each instead of $push. MK.
(deftest add-array-value-to-an-existing-array-using-$push-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$push modifier appends value to field"]
    (mc/insert db coll {:_id oid :title title :tags ["mongodb"]})
    (mc/update db coll {:_id oid} {$push {:tags ["modifiers" "operators"]}})
    (is (= {:_id oid :title title :tags ["mongodb" ["modifiers" "operators"]]}
           (mc/find-map-by-id db coll oid)))))

(deftest double-add-value-to-an-existing-array-using-$push-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$push modifier appends value to field"]
    (mc/insert db coll {:_id oid :title title :tags ["mongodb"]})
    (mc/update db coll {:_id oid} {$push {:tags "modifiers"}})
    (mc/update db coll {:_id oid} {$push {:tags "modifiers"}})
    (is (= {:_id oid :title title :tags ["mongodb" "modifiers" "modifiers"]}
           (mc/find-map-by-id db coll oid)))))
;;
;; $push $each
;;

(deftest initialize-an-array-using-$push-$each-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$push with $each modifier appends multiple values to field"]
    (mc/insert db coll {:_id oid :title title})
    (mc/update db coll {:_id oid} {$push {:tags {$each ["mongodb" "docs"]}}})
    (is (= {:_id oid :title title :tags ["mongodb" "docs"]}
           (mc/find-map-by-id db coll oid)))))

(deftest add-values-to-an-existing-array-using-$push-$each-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$push with $each modifier appends multiple values to field"]
    (mc/insert db coll {:_id oid :title title :tags ["mongodb"]})
    (mc/update db coll {:_id oid} {$push {:tags {$each ["modifiers" "docs"]}}})
    (is (= {:_id oid :title title :tags ["mongodb" "modifiers" "docs"]}
           (mc/find-map-by-id db coll oid)))))

(deftest double-add-value-to-an-existing-array-using-$push-$each-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$push with $each modifier appends multiple values to field"]
    (mc/insert db coll {:_id oid :title title :tags ["mongodb" "docs"]})
    (mc/update db coll {:_id oid} {$push {:tags {$each ["modifiers" "docs"]}}})
    (is (= {:_id oid :title title :tags ["mongodb" "docs" "modifiers" "docs"]}
           (mc/find-map-by-id db coll oid)))))

;;
;; $push + $each (formerly $pushAll)
;;

(deftest initialize-an-array-using-$push-and-$each-modifiers
  (let [coll "docs"
        oid (ObjectId.)
        title "$pushAll modifier appends multiple values to field"]
    (mc/insert db coll {:_id oid :title title})
    (mc/update db coll {:_id oid} {$push {:tags {$each ["mongodb" "docs"]}}})
    (is (= {:_id oid :title title :tags ["mongodb" "docs"]}
           (mc/find-map-by-id db coll oid)))))

(deftest add-value-to-an-existing-array-using-$push-and-$each-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$pushAll modifier appends multiple values to field"]
    (mc/insert db coll {:_id oid :title title :tags ["mongodb"]})
    (mc/update db coll {:_id oid} {$push {:tags {$each ["modifiers" "docs"]}}})
    (is (= {:_id oid :title title :tags ["mongodb" "modifiers" "docs"]}
           (mc/find-map-by-id db coll oid)))))

(deftest double-add-value-to-an-existing-array-using-$push-and-$each-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$pushAll modifier appends multiple values to field"]
    (mc/insert db coll {:_id oid :title title :tags ["mongodb" "docs"]})
    (mc/update db coll {:_id oid} {$push {:tags {$each ["modifiers" "docs"]}}})
    (is (= {:_id oid :title title :tags ["mongodb" "docs" "modifiers" "docs"]}
           (mc/find-map-by-id db coll oid)))))

;;
;; $addToSet
;;

(deftest initialize-an-array-using-$addToSet-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$addToSet modifier appends value to field unless it is already there"]
    (mc/insert db coll {:_id oid :title title})
    (mc/update db coll {:_id oid} {$addToSet {:tags "modifiers"}})
    (is (= {:_id oid :title title :tags ["modifiers"]}
           (mc/find-map-by-id db coll oid)))))

(deftest add-value-to-an-existing-array-using-$addToSet-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$addToSet modifier appends value to field unless it is already there"]
    (mc/insert db coll {:_id oid :title title :tags ["mongodb"]})
    (mc/update db coll {:_id oid} {$addToSet {:tags "modifiers"}})
    (is (= {:_id oid :title title :tags ["mongodb" "modifiers"]}
           (mc/find-map-by-id db coll oid)))))

(deftest double-add-value-to-an-existing-array-using-$addToSet-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$addToSet modifier appends value to field unless it is already there"]
    (mc/insert db coll {:_id oid :title title :tags ["mongodb"]})
    (mc/update db coll {:_id oid} {$addToSet {:tags "modifiers"}})
    (mc/update db coll {:_id oid} {$addToSet {:tags "modifiers"}})
    (is (= {:_id oid :title title :tags ["mongodb" "modifiers"]}
           (mc/find-map-by-id db coll oid)))))

;;
;; $addToSet $each
;;

(deftest initialize-an-array-using-$addToSet-$each-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$addToSet with $each modifier appends multiple values to field unless they are already there"]
    (mc/insert db coll {:_id oid :title title})
    (mc/update db coll {:_id oid} {$addToSet {:tags {$each ["mongodb" "docs"]}}})
    (is (= {:_id oid :title title :tags ["mongodb" "docs"]}
           (mc/find-map-by-id db coll oid)))))

(deftest add-values-to-an-existing-array-using-$addToSet-$each-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$addToSet with $each modifier appends multiple values to field unless they are already there"]
    (mc/insert db coll {:_id oid :title title :tags ["mongodb"]})
    (mc/update db coll {:_id oid} {$addToSet {:tags {$each ["modifiers" "docs"]}}})
    (is (= {:_id oid :title title :tags ["mongodb" "modifiers" "docs"]}
           (mc/find-map-by-id db coll oid)))))

(deftest double-add-value-to-an-existing-array-using-$addToSet-$each-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$addToSet with $each modifier appends multiple values to field unless they are already there"]
    (mc/insert db coll {:_id oid :title title :tags ["mongodb" "docs"]})
    (mc/update db coll {:_id oid} {$addToSet {:tags {$each ["modifiers" "docs" "operators"]}}})
    (is (= {:_id oid :title title :tags ["mongodb" "docs" "modifiers" "operators"]}
           (mc/find-map-by-id db coll oid)))))
;;
;; $pop
;;

(deftest pop-last-value-in-the-array-using-$pop-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$pop modifier removes last or first value in the array"]
    (mc/insert db coll {:_id oid :title title :tags ["products" "apple" "reviews"]})
    (mc/update db coll {:_id oid} {$pop {:tags 1}})
    (is (= {:_id oid :title title :tags ["products" "apple"]}
           (mc/find-map-by-id db coll oid)))))

(deftest unshift-first-value-in-the-array-using-$pop-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$pop modifier removes last or first value in the array"]
    (mc/insert db coll {:_id oid :title title :tags ["products" "apple" "reviews"]})
    (mc/update db coll {:_id oid} {$pop {:tags -1}})
    (is (= {:_id oid :title title :tags ["apple" "reviews"]}
           (mc/find-map-by-id db coll oid)))))

(deftest pop-last-values-from-multiple-arrays-using-$pop-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$pop modifier removes last or first value in the array"]
    (mc/insert db coll {:_id oid :title title :tags ["products" "apple" "reviews"] :categories ["apple" "reviews" "drafts"]})
    (mc/update db coll {:_id oid} {$pop {:tags 1 :categories 1}})
    (is (= {:_id oid :title title :tags ["products" "apple"] :categories ["apple" "reviews"]}
           (mc/find-map-by-id db coll oid)))))

;;
;; $pull
;;

(deftest remove-all-value-entries-from-array-using-$pull-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$pull modifier removes all value entries in the array"]
    (mc/insert db coll {:_id oid :title title :measurements [1.0 1.2 1.2 1.2 1.1 1.1 1.2 1.3 1.0]})
    (mc/update db coll {:_id oid} {$pull {:measurements 1.2}})
    (is (= {:_id oid :title title :measurements [1.0 1.1 1.1 1.3 1.0]}
           (mc/find-map-by-id db coll oid)))))

(deftest remove-all-value-entries-from-array-using-$pull-modifier-based-on-a-condition
  (let [coll "docs"
        oid (ObjectId.)
        title "$pull modifier removes all value entries in the array"]
    (mc/insert db coll {:_id oid :title title :measurements [1.0 1.2 1.2 1.2 1.1 1.1 1.2 1.3 1.0]})
    (mc/update db coll {:_id oid} {$pull {:measurements {$gte 1.2}}})
    (is (= {:_id oid :title title :measurements [1.0 1.1 1.1 1.0]}
           (mc/find-map-by-id db coll oid)))))

;;
;; $pullAll
;;

(deftest remove-all-value-entries-from-array-using-$pullAll-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$pullAll modifier removes entries of multiple values in the array"]
    (mc/insert db coll {:_id oid :title title :measurements [1.0 1.2 1.2 1.2 1.1 1.1 1.2 1.3 1.0]})
    (mc/update db coll {:_id oid} {$pullAll {:measurements [1.0 1.1 1.2]}})
    (is (= {:_id oid :title title :measurements [1.3]}
           (mc/find-map-by-id db coll oid)))))

;;
;; $rename
;;

(deftest rename-a-single-field-using-$rename-modifier
  (let [coll "docs"
        oid (ObjectId.)
        title "$rename renames fields"
        v [1.0 1.2 1.2 1.2 1.1 1.1 1.2 1.3 1.0]]
    (mc/insert db coll {:_id oid :title title :measurements v})
    (mc/update db coll {:_id oid} {$rename {:measurements "results"}})
    (is (= {:_id oid :title title :results v}
           (mc/find-map-by-id db coll oid)))))
;;
;; find-and-modify
;;

(deftest find-and-modify-a-single-document
  (let [coll "docs"
        oid (ObjectId.)
        doc {:_id oid :name "Sophie Bangs" :level 42}
        conditions {:name "Sophie Bangs"}
        update {$inc {:level 1}}]
    (mc/insert db coll doc)
    (let [res (mc/find-and-modify db coll conditions update {:return-new true})]
      (is (= (select-keys res [:name :level]) {:name "Sophie Bangs" :level 43})))))

(deftest find-and-modify-remove-a-document
  (let [coll "docs"
        oid (ObjectId.)
        doc {:_id oid :name "Sophie Bangs" :level 42}
        conditions {:name "Sophie Bangs"}]
    (mc/insert db coll doc)
    (let [res (mc/find-and-modify db coll conditions {} {:remove true})]
      (is (= (select-keys res [:name :level]) {:name "Sophie Bangs" :level 42}))
      (is (empty? (mc/find-maps db coll conditions))))))

(deftest find-and-modify-upsert-a-document
  (testing "case 1"
    (let [coll "docs"
          oid (ObjectId.)
          doc {:_id oid :name "Sophie Bangs" :level 42}]
      (let [res (mc/find-and-modify db coll doc doc {:upsert true})]
        (is (empty? res))
        (is (select-keys (mc/find-map-by-id db coll oid) [:name :level]) (dissoc doc :_id)))))
  (testing "case 2"
    (let [coll "docs"
          query {:name "Sophie Bangs"}
          doc (merge query {:level 42})]
      (let [res (mc/find-and-modify db coll query doc {:upsert true :return-new true})]
        (is (:_id res))
        (is (select-keys (mc/find-map-by-id db coll (:_id res)) [:name :level]) doc)))))

(deftest find-and-modify-after-sort
  (let [coll "docs"
        oid (ObjectId.)
        oid2 (ObjectId.)
        doc {:name "Sophie Bangs"}
        doc1 (assoc doc :_id oid :level 42)
        doc2 (assoc doc :_id oid2 :level 0)]
    (mc/insert-batch db coll [doc1 doc2])
    (let [res (mc/find-and-modify db coll doc {$inc {:level 1}} {:sort {:level -1}})]
      (is (= (select-keys res [:name :level]) {:name "Sophie Bangs" :level 42}))))))

(deftest find-and-modify-after-sort
  (let [coll "docs"
        oid (ObjectId.)
        oid2 (ObjectId.)
        doc {:name "Sophie Bangs"}
        doc1 (assoc doc :_id oid :level 42)
        doc2 (assoc doc :_id oid2 :level 0)]
    (mgcol/insert-batch coll [doc1 doc2])
    (let [res (mgcol/find-and-modify coll doc {$inc {:level 1}} :sort {:level -1})]
      (is (= (select-keys res [:name :level]) {:name "Sophie Bangs" :level 42})))))
|
||||

@@ -1,42 +1,52 @@
(ns monger.test.authentication-test
  (:require [monger util db]
            [monger.credentials :as mcr]
            [monger.core :as mg]
            [monger.collection :as mc]
            [clojure.test :refer :all]))
  (:require [monger core util db]
            [monger.test.helper :as helper]
            [monger.collection :as mc])
  (:use clojure.test))

(helper/connect!)

;;
;; Connection via URI
;;

(when-not (System/getenv "CI")
  (deftest ^{:authentication true} connect-to-mongo-via-uri-without-credentials
    (let [{:keys [conn db]} (mg/connect-via-uri "mongodb://127.0.0.1/monger-test4")]
      (is (-> conn .getAddress (.sameHost "127.0.0.1")))))
    (let [connection (monger.core/connect-via-uri! "mongodb://127.0.0.1/monger-test4")]
      (is (= (-> connection .getAddress ^InetAddress (.sameHost "127.0.0.1")))))
    ;; reconnect using regular host
    (helper/connect!))

  (deftest ^{:authentication true} connect-to-mongo-via-uri-with-valid-credentials
    (let [{:keys [conn db]} (mg/connect-via-uri "mongodb://clojurewerkz%2Fmonger:monger@127.0.0.1/monger-test4")]
      (is (= "monger-test4" (.getName db)))
      (is (-> conn .getAddress (.sameHost "127.0.0.1")))
      (mc/remove db "documents")
    (let [connection (monger.core/connect-via-uri! "mongodb://clojurewerkz/monger:monger@127.0.0.1/monger-test4")]
      (is (= "monger-test4" (.getName (monger.core/current-db))))
      (is (= (-> connection .getAddress ^InetAddress (.sameHost "127.0.0.1"))))
      (mc/remove "documents")
      ;; make sure that the database is selected
      ;; and operations get through.
      (mc/insert db "documents" {:field "value"})
      (is (= 1 (mc/count db "documents" {}))))))
      (mc/insert "documents" {:field "value"})
      (is (= 1 (mc/count "documents" {}))))
    ;; reconnect using regular host
    (helper/connect!)))

(if-let [uri (System/getenv "MONGOHQ_URL")]
  (deftest ^{:external true :authentication true} connect-to-mongo-via-uri-with-valid-credentials
    (let [{:keys [conn db]} (mg/connect-via-uri uri)]
      (is (-> conn .getAddress (.sameHost "127.0.0.1"))))))
    (let [connection (monger.core/connect-via-uri! uri)]
      (is (= (-> connection .getAddress ^InetAddress (.sameHost "127.0.0.1")))))
    ;; reconnect using regular host
    (helper/connect!)))


;;
;; Regular connecton
;;
(deftest ^{:authentication true} test-authentication-with-valid-credentials-on-the-default-db
  ;; see ./bin/ci/before_script.sh. MK.
  (let [username "clojurewerkz/monger"
        pwd "monger"]
    (is (monger.core/authenticate username (.toCharArray pwd)))))

(deftest ^{:authentication true} test-authentication-with-valid-credentials
  ;; see ./bin/ci/before_script.sh. MK.
  (doseq [s ["monger-test" "monger-test2" "monger-test3" "monger-test4"]]
    (let [creds (mcr/create "clojurewerkz/monger" "monger-test" "monger")
          conn (mg/connect-with-credentials "127.0.0.1" creds)]
      (mc/remove (mg/get-db conn "monger-test") "documents"))))
(deftest ^{:authentication true} test-authentication-with-valid-credentials-on-an-arbitrary-db
  ;; see ./bin/ci/before_script.sh. MK.
  (let [username "clojurewerkz/monger"
        pwd "monger"]
    (is (monger.core/authenticate (monger.core/get-db "monger-test") username (.toCharArray pwd)))))

(deftest ^{:authentication true} test-authentication-with-invalid-credentials
  (let [username "monger"
        ^String pwd (monger.util/random-str 128 32)]
    (is (not (monger.core/authenticate (monger.core/get-db "monger-test2") username (.toCharArray pwd))))))

167 test/monger/test/cache_test.clj Normal file
@@ -0,0 +1,167 @@
(ns monger.test.cache-test
  (:require [monger.test.helper :as helper]
            [monger.core :as mg]
            [monger.collection :as mc])
  (:use clojure.core.cache clojure.test monger.cache)
  (:import [clojure.core.cache BasicCache FIFOCache LRUCache TTLCache]
           java.util.UUID))

;;
;; Playground/Tests. These were necessary because clojure.core.cache has
;; little documentation, incomplete test suite and
;; slightly non-standard (although necessary to support all those cache variations)
;; cache operations protocol.
;;
;; This is by no means clear or complete either but it did the job of helping me
;; explore the API.

(deftest ^{:cache true}
  test-has?-with-basic-cache
  (testing "that has? returns false for misses"
    (let [c (BasicCache. {})]
      (are [v] (is (false? (has? c v)))
           :missing-key
           "missing-key"
           (gensym "missing-key"))))
  (testing "that has? returns true for hits"
    (let [c (BasicCache. {:skey "Value" :lkey (Long/valueOf 10000) "kkey" :keyword})]
      (are [v] (is (has? c v))
           :skey
           :lkey
           "kkey"))))


(deftest ^{:cache true}
  test-lookup-with-basic-cache
  (testing "that lookup returns nil for misses"
    (let [c (BasicCache. {})]
      (are [v] (is (nil? (lookup c v)))
           :missing-key
           "missing-key"
           (gensym "missing-key"))))
  (testing "that lookup returns cached values for hits"
    (let [l (Long/valueOf 10000)
          c (BasicCache. {:skey "Value" :lkey l "kkey" :keyword})]
      (are [v k] (is (= v (lookup c k)))
           "Value" :skey
           l :lkey
           :keyword "kkey"))))

(deftest ^{:cache true}
  test-evict-with-basic-cache
  (testing "that evict has no effect for keys that do not exist"
    (let [c (atom (BasicCache. {:a 1 :b 2}))]
      (swap! c evict :missing-key)
      (is (has? @c :a))
      (is (has? @c :b))))
  (testing "that evict removes keys that did exist"
    (let [c (atom (BasicCache. {:skey "Value" "kkey" :keyword}))]
      (is (has? @c :skey))
      (is (= "Value" (lookup @c :skey)))
      (swap! c evict :skey)
      (is (not (has? @c :skey)))
      (is (= nil (lookup @c :skey)))
      (is (has? @c "kkey"))
      (is (= :keyword (lookup @c "kkey"))))))

(deftest ^{:cache true}
  test-seed-with-basic-cache
  (testing "that seed returns a new value"
    (let [c (atom (BasicCache. {}))]
      (swap! c seed {:a 1 :b "b" "c" :d})
      (are [k v] (do
                   (is (has? @c k))
                   (is (= v (lookup @c k))))
           :a 1
           :b "b"
           "c" :d))))


;;
;; Tests
;;

(helper/connect!)

(use-fixtures :each (fn [f]
                      (mc/remove "basic_monger_cache_entries")
                      (let [db (mg/get-db "altcache")]
                        (mc/remove db "db_aware_monger_cache_entries" {}))
                      (f)
                      (mc/remove "basic_monger_cache_entries")
                      (let [db (mg/get-db "altcache")]
                        (mc/remove db "db_aware_monger_cache_entries" {}))))


(deftest ^{:cache true}
  test-has?-with-basic-monger-cache
  (testing "that has? returns false for misses"
    (let [coll "basic_monger_cache_entries"
          c (basic-monger-cache-factory coll)]
      (is (not (has? c (str (UUID/randomUUID)))))
      (is (not (has? c (str (UUID/randomUUID)))))))
  (testing "that has? returns true for hits"
    (let [coll "basic_monger_cache_entries"
          c (basic-monger-cache-factory coll {"a" 1 "b" "cache" "c" 3/4})]
      (is (has? c "a"))
      (is (has? c "b"))
      (is (has? c "c"))
      (is (not (has? c "d"))))))


(deftest ^{:cache true}
  test-lookup-with-basic-moger-cache
  (testing "that lookup returns nil for misses"
    (let [coll "basic_monger_cache_entries"
          c (basic-monger-cache-factory coll)]
      (are [v] (is (nil? (lookup c v)))
           :missing-key
           "missing-key"
           (gensym "missing-key"))))
  (testing "that lookup returns cached values for hits"
    (let [l (Long/valueOf 10000)
          coll "basic_monger_cache_entries"
          c (basic-monger-cache-factory coll {:skey "Value" :lkey l "kkey" :keyword})]
      (are [v k] (is (= v (lookup c k)))
           "Value" :skey
           l :lkey
           "keyword" "kkey"))))


(deftest ^{:cache true}
  test-has?-with-db-aware-monger-cache
  (testing "that has? returns false for misses"
    (let [db (mg/get-db "altcache")
          coll "db_aware_monger_cache_entries"
          c (db-aware-monger-cache-factory db coll)]
      (is (not (has? c (str (UUID/randomUUID)))))
      (is (not (has? c (str (UUID/randomUUID)))))))
  (testing "that has? returns true for hits"
    (let [db (mg/get-db "altcache")
          coll "db_aware_monger_cache_entries"
          c (db-aware-monger-cache-factory db coll {"a" 1 "b" "cache" "c" 3/4})]
      (is (has? c "a"))
      (is (has? c "b"))
      (is (has? c "c"))
      (is (not (has? c "d"))))))


(deftest ^{:cache true}
  test-lookup-with-db-aware-moger-cache
  (testing "that lookup returns nil for misses"
    (let [db (mg/get-db "altcache")
          coll "db_aware_monger_cache_entries"
          c (db-aware-monger-cache-factory db coll)]
      (are [v] (is (nil? (lookup c v)))
           :missing-key
           "missing-key"
           (gensym "missing-key"))))
  (testing "that lookup returns cached values for hits"
    (let [l (Long/valueOf 10000)
          db (mg/get-db "altcache")
          coll "db_aware_monger_cache_entries"
          c (db-aware-monger-cache-factory db coll {:skey "Value" :lkey l "kkey" :keyword})]
      (are [v k] (is (= v (lookup c k)))
           "Value" :skey
           l :lkey
           "keyword" "kkey"))))

@@ -1,25 +1,36 @@
(set! *warn-on-reflection* true)

(ns monger.test.capped-collections-test
  (:require [monger.core :as mg]
  (:require [monger core util]
            [monger.collection :as mc]
            [clojure.test :refer :all]
            [monger.operators :refer :all]))
            [monger.result :as mres]
            [monger.test.helper :as helper])
  (:use clojure.test
        monger.operators
        monger.test.fixtures))


(helper/connect!)

(defn- megabytes
  [^long n]
  (* n 1024 1024))

(let [conn (mg/connect)
      db (mg/get-db conn "monger-test")]
  (deftest test-inserting-into-capped-collection
    (let [n 1000
          cname "cached"
          _ (mc/drop db cname)
          coll (mc/create db cname {:capped true :size (-> 16 megabytes) :max n})]
      (is (= cname (.getName coll)))
      (mc/insert-batch db cname (for [i (range 0 (+ n 100))] {:i i}))
      (is (= n (mc/count db cname)))
      ;; older elements get replaced by newer ones
      (is (not (mc/any? db cname {:i 1})))
      (is (not (mc/any? db cname {:i 5})))
      (is (not (mc/any? db cname {:i 9})))
      (is (mc/any? db cname {:i (+ n 80)})))))

;;
;; Tests
;;

(deftest test-inserting-into-capped-collection
  (let [n 1000
        cname "cached"
        _ (mc/drop cname)
        coll (mc/create cname {:capped true :size (-> 16 megabytes) :max n})]
    (is (= cname (.getName coll)))
    (mc/insert-batch cname (for [i (range 0 (+ n 100))] {:i i}))
    (is (= n (mc/count cname)))
    ;; older elements get replaced by newer ones
    (is (not (mc/any? cname {:i 1})))
    (is (not (mc/any? cname {:i 5})))
    (is (not (mc/any? cname {:i 9})))
    (is (mc/any? cname {:i (+ n 80)}))))

@@ -1,193 +1,153 @@
(set! *warn-on-reflection* true)

(ns monger.test.collection-test
  (:import org.bson.types.ObjectId
  (:import [com.mongodb WriteResult WriteConcern DBCursor DBObject MapReduceOutput MapReduceCommand MapReduceCommand$OutputType]
           org.bson.types.ObjectId
           java.util.Date)
  (:require [monger.core :as mg]
            [monger.collection :as mc]
            [clojure.test :refer :all]
            [monger.operators :refer :all]))
            [monger.result :as mgres]
            [monger.test.helper :as helper])
  (:use clojure.test
        monger.operators
        monger.test.fixtures))

(let [conn (mg/connect)
      db (mg/get-db conn "monger-test")]
(helper/connect!)

  (defn purge-collections
    [f]
    (mc/remove db "people")
    (mc/remove db "docs")
    (mc/remove db "things")
    (mc/remove db "libraries")
    (f)
    (mc/remove db "people")
    (mc/remove db "docs")
    (mc/remove db "things")
    (mc/remove db "libraries"))

  (use-fixtures :each purge-collections)

  ;;
  ;; count, remove
  ;;

  (deftest get-collection-size
    (let [collection "things"]
      (is (= 0 (mc/count db collection)))
      (mc/insert-batch db collection [{:language "Clojure" :name "langohr"}
                                      {:language "Clojure" :name "monger"}
                                      {:language "Clojure" :name "incanter"}
                                      {:language "Scala" :name "akka"}])
      (is (= 4 (mc/count db collection)))
      (is (mc/any? db collection))
      (is (= 3 (mc/count db collection {:language "Clojure"})))
      (is (mc/any? db collection {:language "Clojure"}))
      (is (= 1 (mc/count db collection {:language "Scala" })))
      (is (mc/any? db collection {:language "Scala"}))
      (is (= 0 (mc/count db collection {:language "Python" })))
      (is (not (mc/any? db collection {:language "Python"})))))
(use-fixtures :each purge-people purge-docs purge-things purge-libraries)


  (deftest remove-all-documents-from-collection
    (let [collection "libraries"]
      (mc/insert-batch db collection [{:language "Clojure" :name "monger"}
                                      {:language "Clojure" :name "langohr"}
                                      {:language "Clojure" :name "incanter"}
                                      {:language "Scala" :name "akka"}])
      (is (= 4 (mc/count db collection)))
      (mc/remove db collection)
      (is (= 0 (mc/count db collection)))))
;;
;; count, remove
;;

(deftest get-collection-size
  (let [collection "things"]
    (is (= 0 (mc/count collection)))
    (mc/insert-batch collection [{:language "Clojure" :name "langohr"}
                                 {:language "Clojure" :name "monger"}
                                 {:language "Clojure" :name "incanter"}
                                 {:language "Scala" :name "akka"}])
    (is (= 4 (mc/count collection)))
    (is (mc/any? collection))
    (is (= 3 (mc/count mg/*mongodb-database* collection {:language "Clojure"})))
    (is (mc/any? mg/*mongodb-database* collection {:language "Clojure"}))
    (is (= 1 (mc/count collection {:language "Scala" })))
    (is (mc/any? collection {:language "Scala"}))
    (is (= 0 (mc/count mg/*mongodb-database* collection {:language "Python" })))
    (is (not (mc/any? mg/*mongodb-database* collection {:language "Python"})))))


  (deftest remove-some-documents-from-collection
    (let [collection "libraries"]
      (mc/insert-batch db collection [{:language "Clojure" :name "monger"}
                                      {:language "Clojure" :name "langohr"}
                                      {:language "Clojure" :name "incanter"}
                                      {:language "Scala" :name "akka"}])
      (is (= 4 (mc/count db collection)))
      (mc/remove db collection {:language "Clojure"})
      (is (= 1 (mc/count db collection)))))

  (deftest remove-a-single-document-from-collection
    (let [collection "libraries"
          oid (ObjectId.)]
      (mc/insert-batch db collection [{:language "Clojure" :name "monger" :_id oid}])
      (mc/remove-by-id db collection oid)
      (is (= 0 (mc/count db collection)))
      (is (nil? (mc/find-by-id db collection oid)))))
(deftest remove-all-documents-from-collection
  (let [collection "libraries"]
    (mc/insert-batch collection [{:language "Clojure" :name "monger"}
                                 {:language "Clojure" :name "langohr"}
                                 {:language "Clojure" :name "incanter"}
                                 {:language "Scala" :name "akka"}])
    (is (= 4 (mc/count collection)))
    (mc/remove collection)
    (is (= 0 (mc/count collection)))))


  ;;
  ;; exists?, drop, create
  ;;
(deftest remove-some-documents-from-collection
  (let [collection "libraries"]
    (mc/insert-batch collection [{:language "Clojure" :name "monger"}
                                 {:language "Clojure" :name "langohr"}
                                 {:language "Clojure" :name "incanter"}
                                 {:language "Scala" :name "akka"}])
    (is (= 4 (mc/count collection)))
    (mc/remove collection {:language "Clojure"})
    (is (= 1 (mc/count collection)))))

  (deftest checking-for-collection-existence-when-it-does-not-exist
    (let [collection "widgets"]
      (mc/drop db collection)
      (is (false? (mc/exists? db collection)))))

  (deftest checking-for-collection-existence-when-it-does-exist
    (let [collection "widgets"]
      (mc/drop db collection)
      (mc/insert-batch db collection [{:name "widget1"}
                                      {:name "widget2"}])
      (is (mc/exists? db collection))
      (mc/drop db collection)
      (is (false? (mc/exists? db collection)))
      (mc/create db "widgets" {:capped true :size 100000 :max 10})
      (is (mc/exists? db collection))
      (mc/rename db collection "gadgets")
      (is (not (mc/exists? db collection)))
      (is (mc/exists? db "gadgets"))
      (mc/drop db "gadgets")))

  ;;
  ;; any?, empty?
  ;;

  (deftest test-any-on-empty-collection
    (let [collection "things"]
      (is (not (mc/any? db collection)))))

  (deftest test-any-on-non-empty-collection
    (let [collection "things"
          _ (mc/insert db collection {:language "Clojure" :name "langohr"})]
      (is (mc/any? db "things" {:language "Clojure"}))))

  (deftest test-empty-on-empty-collection
    (let [collection "things"]
      (is (mc/empty? db collection))))

  (deftest test-empty-on-non-empty-collection
    (let [collection "things"
          _ (mc/insert db collection {:language "Clojure" :name "langohr"})]
      (is (not (mc/empty? db "things")))))
(deftest remove-a-single-document-from-collection
  (let [collection "libraries"
        oid (ObjectId.)]
    (mc/insert-batch collection [{:language "Clojure" :name "monger" :_id oid}])
    (mc/remove-by-id collection oid)
    (is (= 0 (mc/count collection)))
    (is (nil? (mc/find-by-id collection oid)))))


  ;;
  ;; distinct
  ;;
;;
;; exists?, drop, create
;;

  (deftest test-distinct-values
    (let [collection "widgets"
          batch [{:state "CA" :quantity 1 :price 199.00}
                 {:state "NY" :quantity 2 :price 199.00}
                 {:state "NY" :quantity 1 :price 299.00}
                 {:state "IL" :quantity 2 :price 11.50 }
                 {:state "CA" :quantity 2 :price 2.95 }
                 {:state "IL" :quantity 3 :price 5.50 }]]
      (mc/insert-batch db collection batch)
      (is (= ["CA" "IL" "NY"] (sort (mc/distinct db collection :state))))
      (is (= ["CA" "IL" "NY"] (sort (mc/distinct db collection :state {}))))
      (is (= ["CA" "NY"] (sort (mc/distinct db collection :state {:price {$gt 100.00}}))))))
(deftest checking-for-collection-existence-when-it-does-not-exist
  (let [collection "widgets"]
    (mc/drop collection)
    (is (false? (mc/exists? collection)))))

  ;;
  ;; update
  ;;
(deftest checking-for-collection-existence-when-it-does-exist
  (let [collection "widgets"]
    (mc/drop collection)
    (mc/insert-batch collection [{:name "widget1"}
                                 {:name "widget2"}])
    (is (mc/exists? collection))
    (mc/drop collection)
    (is (false? (mc/exists? collection)))
    (mc/create "widgets" {:capped true :size 100000 :max 10})
    (is (mc/exists? collection))
    (mc/rename collection "gadgets")
    (is (not (mc/exists? collection)))
    (is (mc/exists? "gadgets"))
    (mc/drop "gadgets")))

  (let [coll "things"
        batch [{:_id 1 :type "rock" :size "small"}
               {:_id 2 :type "bed" :size "bed-sized"}
               {:_id 3 :type "bottle" :size "1.5 liters"}]]
;;
;; any?, empty?
;;

    (deftest test-update
      (mc/insert-batch db coll batch)
      (is (= "small" (:size (mc/find-one-as-map db coll {:type "rock"}))))
      (mc/update db coll {:type "rock"} {"$set" {:size "huge"}})
      (is (= "huge" (:size (mc/find-one-as-map db coll {:type "rock"})))))
(deftest test-any-on-empty-collection
  (let [collection "things"]
    (is (not (mc/any? collection)))))

    (deftest test-upsert
      (is (mc/empty? db coll))
      (mc/upsert db coll {:_id 4} {"$set" {:size "tiny"}})
      (is (not (mc/empty? db coll)))
      (mc/upsert db coll {:_id 4} {"$set" {:size "big"}})
      (is (= [{:_id 4 :size "big"}] (mc/find-maps db coll {:_id 4}))))
(deftest test-any-on-non-empty-collection
  (let [collection "things"
        _ (mc/insert collection {:language "Clojure" :name "langohr"})]
    (is (mc/any? "things"))
    (is (mc/any? mg/*mongodb-database* "things" {:language "Clojure"}))))

    (deftest test-update-by-id
      (mc/insert-batch db coll batch)
      (is (= "bed" (:type (mc/find-one-as-map db coll {:_id 2}))))
      (mc/update-by-id db coll 2 {"$set" {:type "living room"}})
      (is (= "living room" (:type (mc/find-one-as-map db coll {:_id 2})))))
(deftest test-empty-on-empty-collection
  (let [collection "things"]
    (is (mc/empty? collection))
    (is (mc/empty? mg/*mongodb-database* collection))))

    (deftest test-update-by-ids
      (mc/insert-batch db coll batch)
      (is (= "bed" (:type (mc/find-one-as-map db coll {:_id 2}))))
      (is (= "bottle" (:type (mc/find-one-as-map db coll {:_id 3}))))
      (mc/update-by-ids db coll [2 3] {"$set" {:type "dog"}})
      (is (= "dog" (:type (mc/find-one-as-map db coll {:_id 2}))))
      (is (= "dog" (:type (mc/find-one-as-map db coll {:_id 3}))))))
(deftest test-empty-on-non-empty-collection
  (let [collection "things"
        _ (mc/insert collection {:language "Clojure" :name "langohr"})]
    (is (not (mc/empty? "things")))))

  ;;
  ;; miscellenous
  ;;

  (deftest test-system-collection-predicate
    (are [name] (is (mc/system-collection? name))
         "system.indexes"
         "system"
         ;; we treat default GridFS collections as system ones,
         ;; possibly this is a bad idea, time will tell. MK.
         "fs.chunks"
         "fs.files")
    (are [name] (is (not (mc/system-collection? name)))
         "events"
         "accounts"
         "megacorp_account"
         "myapp_development")))
;;
;; distinct
;;

(deftest test-distinct-values
  (let [collection "widgets"
        batch [{:state "CA" :quantity 1 :price 199.00}
               {:state "NY" :quantity 2 :price 199.00}
               {:state "NY" :quantity 1 :price 299.00}
               {:state "IL" :quantity 2 :price 11.50 }
               {:state "CA" :quantity 2 :price 2.95 }
               {:state "IL" :quantity 3 :price 5.50 }]]
    (mc/insert-batch collection batch)
    (is (= ["CA" "IL" "NY"] (sort (mc/distinct mg/*mongodb-database* collection :state {}))))
    (is (= ["CA" "NY"] (sort (mc/distinct collection :state {:price {$gt 100.00}}))))))


;;
;; miscellenous
;;

(deftest test-system-collection-predicate
  (are [name] (is (mc/system-collection? name))
       "system.indexes"
       "system"
       ;; we treat default GridFS collections as system ones,
       ;; possibly this is a bad idea, time will tell. MK.
       "fs.chunks"
       "fs.files")
  (are [name] (is (not (mc/system-collection? name)))
       "events"
       "accounts"
       "megacorp_account"
       "myapp_development"))

@@ -1,29 +1,36 @@
(ns monger.test.command-test
  (:require [monger.core :as mg]
            [monger.command :as mcom]
            [monger.collection :as mc]
            [clojure.test :refer :all]
            [monger.result :refer [acknowledged?]]
            [monger.conversion :refer [from-db-object]]))
            [monger.test.helper :as helper]
            [monger.collection :as mc])
  (:use clojure.test
        [monger.result :only [ok?]]
        [monger.conversion :only [from-db-object]]))

(let [conn (mg/connect)
      db (mg/get-db conn "monger-test")]
  (deftest ^{:command true} test-reindex-collection
    (let [_ (mc/insert db "test" {:name "Clojure"})
          result (mcom/reindex-collection db "test")]
      (is (acknowledged? result))))
(helper/connect!)

  (deftest ^{:command true} test-server-status
    (let [status (mcom/server-status db)]
      (is (acknowledged? status))
      (is (not-empty status))))

  (deftest ^{:command true} test-top
    (let [result (mcom/top conn)]
      (is (acknowledged? result))
      (is (not-empty result))))
(deftest ^{:command true} test-reindex-collection
  (let [_ (mc/insert "test" {:name "Clojure"})
        result (mcom/reindex-collection "test")]
    (is (ok? result))
    (is (get result "indexes"))))

  (deftest ^{:command true} test-running-is-master-as-an-arbitrary-command
    (let [raw (mg/command db {:isMaster 1})
          result (from-db-object raw true)]
      (is (acknowledged? raw)))))
(deftest ^{:command true} test-server-status
  (let [status (mcom/server-status)]
    (is (ok? status))
    (is (not-empty status))
    (is (get status "serverUsed"))))

(deftest ^{:command true} test-top
  (let [result (mcom/top)]
    (is (ok? result))
    (is (not-empty result))
    (is (get result "serverUsed"))))

(deftest ^{:command true} test-running-is-master-as-an-arbitrary-command
  (let [raw (mg/command {:isMaster 1})
        result (from-db-object raw true)]
    (is (ok? result))
    (is (ok? raw))
    (is (:ismaster result))))

@@ -1,11 +1,9 @@
(ns monger.test.conversion-test
  (:require [monger core collection]
            [clojure.test :refer :all]
            [monger.conversion :refer :all])
  (:require [monger core collection])
  (:import [com.mongodb DBObject BasicDBObject BasicDBList]
           [java.util Date Calendar List ArrayList]
           org.bson.types.ObjectId
           (org.bson.types Decimal128)))
           org.bson.types.ObjectId)
  (:use clojure.test monger.conversion))


;;

@@ -102,13 +100,6 @@
  (is (= 2 (from-db-object 2 false)))
  (is (= 2 (from-db-object 2 true))))

(deftest convert-decimal-from-dbobject
  (is (= 2.3M (from-db-object (Decimal128. 2.3M) false)))
  (is (= 2.3M (from-db-object (Decimal128. 2.3M) true)))
  (is (= 2.3M (from-db-object (Decimal128/parse "2.3") true)))
  (is (not= 2.32M (from-db-object (Decimal128/parse "2.3") true))))

(deftest convert-float-from-dbobject
  (is (= 3.3 (from-db-object 3.3 false)))
  (is (= 3.3 (from-db-object 3.3 true))))

@@ -120,20 +111,20 @@
                (.put "name" name)
                (.put "age" age))
        output (from-db-object input false)]
    (is (= output { "name" name, "age" age }))
    (is (= (output { "name" name, "age" age })))
    (is (= (output "name") name))
    (is (nil? (output :name)))
    (is (= (output "age") age))
    (is (nil? (output "points")))))

(deftest convert-flat-db-object-to-map-with-keywordizing
(deftest convert-flat-db-object-to-map-without-keywordizing
  (let [name "Michael"
        age 26
        input (doto (BasicDBObject.)
                (.put "name" name)
                (.put "age" age))
        output (from-db-object input true)]
    (is (= output { :name name, :age age }))
    (is (= (output { :name name, :age age })))
    (is (= (output :name) name))
    (is (nil? (output "name")))
    (is (= (output :age) age))

@@ -1,27 +1,30 @@
(ns monger.test.core-test
  (:require [monger util result]
            [monger.core :as mg :refer [server-address mongo-options]]
            [monger.collection :as mc]
            [clojure.test :refer :all])
  (:import [com.mongodb MongoClient DB WriteConcern MongoClientOptions ServerAddress]))
  (:require [monger core collection util result]
            [monger.test.helper :as helper]
            [monger.collection :as mc])
  (:import [com.mongodb MongoClient DB WriteConcern MongoClientOptions ServerAddress])
  (:use clojure.test
        [monger.core :only [server-address mongo-options]]))

(println (str "Using Clojure version " *clojure-version*))
(helper/connect!)

(deftest connect-to-mongo-with-default-host-and-port
  (let [connection (mg/connect)]
  (let [connection (monger.core/connect)]
    (is (instance? com.mongodb.MongoClient connection))))

(deftest connect-and-disconnect
  (let [conn (mg/connect)]
    (mg/disconnect conn)))
  (monger.core/connect!)
  (monger.core/disconnect!)
  (monger.core/connect!))

(deftest connect-to-mongo-with-default-host-and-explicit-port
  (let [connection (mg/connect {:port 27017})]
  (let [connection (monger.core/connect { :port 27017 })]
    (is (instance? com.mongodb.MongoClient connection))))


(deftest connect-to-mongo-with-default-port-and-explicit-host
  (let [connection (mg/connect {:host "127.0.0.1"})]
  (let [connection (monger.core/connect { :host "127.0.0.1" })]
    (is (instance? com.mongodb.MongoClient connection))))

(deftest test-server-address

@@ -32,58 +35,32 @@
  (is (= port (.getPort sa)))))

(deftest use-existing-mongo-connection
  (let [^MongoClientOptions opts (mongo-options {:threads-allowed-to-block-for-connection-multiplier 300})
        connection (MongoClient. "127.0.0.1" opts)
        db (mg/get-db connection "monger-test")]
    (mg/disconnect connection)))
  (let [^MongoClientOptions opts (mongo-options :threads-allowed-to-block-for-connection-multiplier 300)
        connection (MongoClient. "127.0.0.1" opts)]
    (monger.core/set-connection! connection)
    (is (= monger.core/*mongodb-connection* connection))))

(deftest connect-to-mongo-with-extra-options
  (let [^MongoClientOptions opts (mongo-options {:threads-allowed-to-block-for-connection-multiplier 300})
        ^ServerAddress sa (server-address "127.0.0.1" 27017)
        conn (mg/connect sa opts)]
    (mg/disconnect conn)))
  (let [^MongoClientOptions opts (mongo-options :threads-allowed-to-block-for-connection-multiplier 300)
        ^ServerAddress sa (server-address "127.0.0.1" 27017)]
    (monger.core/connect! sa opts)))


(deftest get-database
  (let [connection (mg/connect)
        db (mg/get-db connection "monger-test")]
  (let [connection (monger.core/connect)
        db (monger.core/get-db connection "monger-test")]
    (is (instance? com.mongodb.DB db))))


(deftest test-get-db-names
  (let [conn (mg/connect)
        dbs (mg/get-db-names conn)]
  (let [dbs (monger.core/get-db-names)]
    (is (not (empty? dbs)))
    (is (dbs "monger-test"))))

(deftest monger-options-test
  (let [opts {:always-use-mbeans true
              :application-name "app"
              :connect-timeout 1
              :connections-per-host 1
              :cursor-finalizer-enabled true
              :description "Description"
              :heartbeat-connect-timeout 1
              :heartbeat-frequency 1
              :heartbeat-socket-timeout 1
              :local-threshold 1
              :max-connection-idle-time 1
              :max-connection-life-time 1
              :max-wait-time 1
              :min-connections-per-host 1
              :min-heartbeat-frequency 1
              :required-replica-set-name "rs"
              :retry-writes true
              :server-selection-timeout 1
              :socket-keep-alive true
              :socket-timeout 1
              :ssl-enabled true
              :ssl-invalid-host-name-allowed true
              :threads-allowed-to-block-for-connection-multiplier 1
              :uuid-representation org.bson.UuidRepresentation/STANDARD
              :write-concern com.mongodb.WriteConcern/JOURNAL_SAFE}]
    (is (instance? com.mongodb.MongoClientOptions$Builder (mg/mongo-options-builder opts)))))

(deftest connect-to-uri-without-db-name
  (let [uri "mongodb://localhost:27017"]
    (is (thrown? IllegalArgumentException (mg/connect-via-uri uri)))))
|
||||
(deftest get-last-error
|
||||
(let [connection (monger.core/connect)
|
||||
db (monger.core/get-db connection "monger-test")]
|
||||
(is (monger.result/ok? (monger.core/get-last-error)))
|
||||
(is (monger.result/ok? (monger.core/get-last-error db)))
|
||||
(is (monger.result/ok? (monger.core/get-last-error db WriteConcern/NORMAL)))
|
||||
(is (monger.result/ok? (monger.core/get-last-error db 1 100 true)))))
|

@@ -1,107 +0,0 @@
(ns monger.test.cursor-test
  (:import [com.mongodb DBCursor DBObject Bytes]
           [java.util List Map])
  (:require [monger.core :as mg]
            [clojure.test :refer :all]
            [monger.cursor :refer :all]))

(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")]
  (deftest make-db-cursor-for-collection
    (is (= DBCursor
           (class (make-db-cursor db :docs)))))

  (deftest getting-cursor-options-value
    (let [db-cur (make-db-cursor db :docs)
          opts   (get-options db-cur)]
      (is (= true (isa? (class opts) Map)))
      (is (= 0 (.getOptions db-cur))) ;; test default value
      (is (= false (:notimeout opts)))
      (is (= false (:partial opts)))
      (is (= false (:awaitdata opts)))
      (is (= false (:oplogreplay opts)))
      (is (= false (:slaveok opts)))
      (is (= false (:tailable opts)))))

  (deftest adding-option-to-cursor
    (let [db-cur (make-db-cursor db :docs)]
      (add-option! db-cur :notimeout)
      (is (= (:notimeout cursor-options)
             (.getOptions db-cur)))
      (add-option! db-cur :tailable)
      (is (= (.getOptions db-cur)
             (bit-or (:notimeout cursor-options)
                     (:tailable cursor-options))))))

  (deftest remove-option-from-cursor
    (let [db-cur (make-db-cursor db :docs)]
      (add-option! db-cur :partial)
      (add-option! db-cur :awaitdata)
      ;; removing not-set option should not affect result
      (remove-option! db-cur :notimeout)
      (is (= (.getOptions db-cur)
             (bit-or (:partial cursor-options)
                     (:awaitdata cursor-options))))
      ;; removing active option should remove correct value
      (remove-option! db-cur :awaitdata)
      (is (= (.getOptions db-cur)
             (:partial cursor-options)))))


  (deftest test-reset-options
    (let [db-cur (make-db-cursor db :docs)]
      (add-option! db-cur :partial)
      (is (= (.getOptions db-cur)
             (:partial cursor-options)))
      (is (= 0
             (int (.getOptions (reset-options db-cur)))))))

  (deftest add-options-with-hashmap
    (let [db-cur (make-db-cursor db :docs)
          _      (add-options db-cur {:notimeout true :slaveok true})
          opts   (get-options db-cur)]
      (is (= true (:notimeout opts)))
      (is (= true (:slaveok opts)))
      (is (= false (:tailable opts)))
      (is (= false (:oplogreplay opts)))))

  (deftest add-options-with-hashmap-and-remove-option
    (let [db-cur (make-db-cursor db :docs)
          _      (add-options db-cur {:notimeout true :slaveok true})
          opts   (get-options db-cur)]
      (is (= true (:notimeout opts)))
      (is (= true (:slaveok opts)))
      ;; remove key and add another option
      (add-options db-cur {:partial true :slaveok false})
      (let [opts (get-options db-cur)]
        (is (= true (:notimeout opts)))
        (is (= true (:partial opts)))
        (is (= false (:slaveok opts)))
        (is (= false (:tailable opts))))))

  (deftest add-options-with-list
    (let [db-cur (make-db-cursor db :docs)
          _      (add-options db-cur [:notimeout :slaveok])
          opts   (get-options db-cur)]
      (is (= true (:notimeout opts)))
      (is (= true (:slaveok opts)))
      (is (= false (:tailable opts)))
      (is (= false (:oplogreplay opts)))))

  (deftest add-options-with-Bytes
    (let [db-cur (make-db-cursor db :docs)
          _      (add-options db-cur Bytes/QUERYOPTION_NOTIMEOUT)
          opts   (get-options db-cur)]
      (is (= true (:notimeout opts)))
      (is (= false (:slaveok opts)))
      (is (= false (:tailable opts)))
      (is (= false (:oplogreplay opts)))))

  (deftest add-options-with-one-keyword
    (let [db-cur (make-db-cursor db :docs)
          _      (add-options db-cur :notimeout)
          opts   (get-options db-cur)]
      (is (= true (:notimeout opts)))
      (is (= false (:slaveok opts)))
      (is (= false (:tailable opts)))
      (is (= false (:oplogreplay opts))))))

@@ -1,31 +1,44 @@
(ns monger.test.db-test
  (:require [monger.db :as mdb]
            [monger.core :as mg]
            [monger.collection :as mc]
            [clojure.test :refer :all])
  (:require [monger core db]
            [monger.test.helper :as helper]
            [monger.collection :as mc])
  (:import [com.mongodb Mongo DB]
           java.util.Set))
           java.util.Set)
  (:use clojure.test))

(helper/connect!)


;; do not run this test for CI, it complicates matters by messing up
;; authentication for some other tests :( MK.
(let [conn (mg/connect)]
  (when-not (System/getenv "CI")
    (deftest test-drop-database
      ;; drop a secondary database here. MK.
      (let [db         (mg/get-db conn "monger-test3")
            collection "test"
            _          (mc/insert db collection {:name "Clojure"})
            check      (mc/count db collection)
            _          (mdb/drop-db db)]
(when-not (System/getenv "CI")
  (deftest test-drop-database
    ;; drop a secondary database here. MK.
    (monger.core/with-db (monger.core/get-db "monger-test3")
      (let [collection "test"
            _          (mc/insert collection {:name "Clojure"})
            check      (mc/count collection)
            _          (monger.db/drop-db)]
        (is (= 1 check))
        (is (not (mc/exists? db collection)))
        (is (= 0 (mc/count db collection))))))
        (is (not (mc/exists? collection)))
        (is (= 0 (mc/count collection))))))

  (deftest test-get-collection-names
    (let [db (mg/get-db conn "monger-test")]
      (mc/insert db "test-1" {:name "Clojure"})
      (mc/insert db "test-2" {:name "Clojure"})
      (let [^Set xs (mdb/get-collection-names db)]
        (is (.contains xs "test-1"))
        (is (.contains xs "test-2"))))))
(deftest test-use-database
  (monger.core/use-db! "monger-test5")
  (is (= "monger-test5" (.getName (monger.core/current-db))))
  (let [collection "test"
        _          (mc/insert collection {:name "Clojure"})
        check      (mc/count collection)
        _          (monger.db/drop-db)]
    (is (= 1 check))
    (is (not (mc/exists? collection)))
    (is (= 0 (mc/count collection))))))


(deftest test-get-collection-names
  (mc/insert "test-1" {:name "Clojure"})
  (mc/insert "test-2" {:name "Clojure"})
  (let [^Set collections (monger.db/get-collection-names)]
    (is (.contains collections "test-1"))
    (is (.contains collections "test-2"))))

19
test/monger/test/fixtures.clj
Normal file

@@ -0,0 +1,19 @@
(ns monger.test.fixtures
  (:use [monger.testkit :only [defcleaner]]))

;;
;; fixture functions
;;

(defcleaner people     "people")
(defcleaner docs       "docs")
(defcleaner things     "things")
(defcleaner libraries  "libraries")
(defcleaner scores     "scores")
(defcleaner locations  "locations")
(defcleaner domains    "domains")
(defcleaner pages      "pages")

(defcleaner cached     "cached")

(defcleaner migrations "meta.migrations")

@@ -1,28 +1,31 @@
(ns monger.test.full-text-search-test
  (:require [monger.core :as mg]
            [monger.collection :as mc]
            [monger.search :as ms]
            [monger.command :as cmd]
            [monger.operators :refer :all]
            [clojure.test :refer [deftest is use-fixtures]]
            [monger.result :refer [acknowledged?]])
  (:import com.mongodb.BasicDBObjectBuilder))
            [monger.test.helper :as helper])
  (:use [clojure.test :only [deftest is use-fixtures]]
        monger.test.fixtures
        [monger.result :only [ok?]]))

(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")
      coll "search-docs"]
(helper/connect!)

  (defn purge-collections
    [f]
    (mc/purge-many db [coll])
    (f)
    (mc/purge-many db [coll]))
(defn enable-search
  [f]
  (is (ok? (cmd/admin-command {:setParameter "*" :textSearchEnabled true})))
  (f))

  (use-fixtures :each purge-collections)
(use-fixtures :each purge-docs)
(use-fixtures :once enable-search)

  (deftest ^{:search true} test-basic-full-text-search-query
    (mc/ensure-index db coll (array-map :subject "text" :content "text"))
    (mc/insert db coll {:subject "hello there" :content "this should be searchable"})
    (mc/insert db coll {:subject "untitled" :content "this is just noize"})
    (let [xs (mc/find-maps db coll {$text {$search "hello"}})]
      (is (= 1 (count xs)))
      (is (= "hello there" (-> xs first :subject))))))
(deftest ^{:edge-features true :search true} test-basic-full-text-search-query
  (let [coll "docs"]
    (mc/ensure-index coll (array-map :subject "text" :content "text"))
    (mc/insert coll {:subject "hello there" :content "this should be searchable"})
    (mc/insert coll {:subject "untitled" :content "this is just noize"})
    (let [res (ms/search coll "hello")
          xs  (ms/results-from res)]
      (is (ok? res))
      (println res)
      (is (= "hello there" (-> xs first :obj :subject)))
      (is (= 1.0 (-> xs first :score))))))

@@ -1,212 +1,166 @@
(ns monger.test.gridfs-test
  (:refer-clojure :exclude [count remove find])
  (:use clojure.test
        [monger.core :only [count]]
        monger.test.fixtures
        [monger operators conversion]
        [monger.gridfs :only [store make-input-file store-file filename content-type metadata]])
  (:require [monger.gridfs :as gridfs]
            [clojure.java.io :as io]
            [clojure.test :refer :all]
            [monger.core :as mg :refer [count]]
            [monger.operators :refer :all]
            [monger.conversion :refer :all]
            [monger.gridfs :refer [store make-input-file store-file filename content-type metadata]])
            [monger.test.helper :as helper]
            [clojure.java.io :as io])
  (:import [java.io InputStream File FileInputStream]
           [com.mongodb.gridfs GridFS GridFSInputFile GridFSDBFile]))

(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")
      fs   (mg/get-gridfs conn "monger-test")]
  (defn purge-gridfs*
    []
    (gridfs/remove-all fs))
(defn purge-gridfs*
  []
  (gridfs/remove-all))

  (defn purge-gridfs
    [f]
    (gridfs/remove-all fs)
    (f)
    (gridfs/remove-all fs))
(defn purge-gridfs
  [f]
  (gridfs/remove-all)
  (f)
  (gridfs/remove-all))

  (use-fixtures :each purge-gridfs)
(use-fixtures :each purge-gridfs)

  (deftest ^{:gridfs true} test-storing-files-to-gridfs-using-relative-fs-paths
    (let [input "./test/resources/mongo/js/mapfun1.js"]
      (is (= 0 (count (gridfs/all-files fs))))
      (store (make-input-file fs input)
             (.setFilename "monger.test.gridfs.file1")
             (.setContentType "application/octet-stream"))
      (is (= 1 (count (gridfs/all-files fs))))))


  (deftest ^{:gridfs true} test-storing-files-to-gridfs-using-file-instances
    (let [input (io/as-file "./test/resources/mongo/js/mapfun1.js")]
      (is (= 0 (count (gridfs/all-files fs))))
      (store-file (make-input-file fs input)
                  (filename "monger.test.gridfs.file2")
                  (content-type "application/octet-stream"))
      (is (= 1 (count (gridfs/all-files fs))))))

  (deftest ^{:gridfs true} test-storing-bytes-to-gridfs
    (let [input (.getBytes "A string")
          md    {:format "raw" :source "AwesomeCamera D95"}
          fname "monger.test.gridfs.file3"
          ct    "application/octet-stream"]
      (is (= 0 (count (gridfs/all-files fs))))
      (store-file (make-input-file fs input)
                  (filename fname)
                  (metadata md)
                  (content-type "application/octet-stream"))
      (let [f (first (gridfs/files-as-maps fs))]
        (is (= ct (:contentType f)))
        (is (= fname (:filename f)))
        (is (= md (:metadata f))))
      (is (= 1 (count (gridfs/all-files fs))))))

  (deftest ^{:gridfs true} test-storing-files-to-gridfs-using-absolute-fs-paths
    (let [tmp-file (File/createTempFile "monger.test.gridfs" "test-storing-files-to-gridfs-using-absolute-fs-paths")
          _        (spit tmp-file "Some content")
          input    (.getAbsolutePath tmp-file)]
      (is (= 0 (count (gridfs/all-files fs))))
      (store-file (make-input-file fs input)
                  (filename "monger.test.gridfs.file4")
                  (content-type "application/octet-stream"))
      (is (= 1 (count (gridfs/all-files fs))))))

  (deftest ^{:gridfs true} test-storing-files-to-gridfs-using-input-stream
    (let [tmp-file (File/createTempFile "monger.test.gridfs" "test-storing-files-to-gridfs-using-input-stream")
          _        (spit tmp-file "Some other content")]
      (is (= 0 (count (gridfs/all-files fs))))
      (store-file fs
                  (make-input-file (FileInputStream. tmp-file))
                  (filename "monger.test.gridfs.file4b")
                  (content-type "application/octet-stream"))
      (is (= 1 (count (gridfs/all-files fs))))))

  (deftest ^{:gridfs true} test-deleting-file-instance-on-disk-after-storing
    (let [tmp-file (File/createTempFile "monger.test.gridfs" "test-deleting-file-instance-on-disk-after-storing")
          _        (spit tmp-file "to be deleted")]
      (is (= 0 (count (gridfs/all-files fs))))
      (store-file (make-input-file fs tmp-file)
                  (filename "test-deleting-file-instance-on-disk-after-storing")
                  (content-type "application/octet-stream"))
      (is (= 1 (count (gridfs/all-files fs))))
      (is (.delete tmp-file))))
(helper/connect!)


  (deftest ^{:gridfs true} test-finding-individual-files-on-gridfs
    (testing "gridfs/find-one"
      (purge-gridfs*)
      (let [input  "./test/resources/mongo/js/mapfun1.js"
            ct     "binary/octet-stream"
            fname  "monger.test.gridfs.file5"
            md5    "14a09deabb50925a3381315149017bbd"
            stored (store-file (make-input-file fs input)
                               (filename fname)
                               (content-type ct))]
        (is (= 1 (count (gridfs/all-files fs))))
        (is (:_id stored))
        (is (:uploadDate stored))
        (is (= 62 (:length stored)))
        (is (= md5 (:md5 stored)))
        (is (= fname (:filename stored)))
        (is (= ct (:contentType stored)))
        (are [a b] (is (= a (:md5 (from-db-object (gridfs/find-one fs b) true))))
             md5 {:_id (:_id stored)}
             md5 (to-db-object {:md5 md5}))))
    (testing "gridfs/find-one-as-map"
      (purge-gridfs*)
      (let [input  "./test/resources/mongo/js/mapfun1.js"
            ct     "binary/octet-stream"
            fname  "monger.test.gridfs.file6"
            md5    "14a09deabb50925a3381315149017bbd"
            stored (store-file (make-input-file fs input)
                               (filename fname)
                               (metadata (to-db-object {:meta "data"}))
                               (content-type ct))]
        (is (= 1 (count (gridfs/all-files fs))))
        (is (:_id stored))
        (is (:uploadDate stored))
        (is (= 62 (:length stored)))
        (is (= md5 (:md5 stored)))
        (is (= fname (:filename stored)))
        (is (= ct (:contentType stored)))
        (let [m (gridfs/find-one-as-map fs {:filename fname})]
          (is (= {:meta "data"} (:metadata m))))
        (are [a query] (is (= a (:md5 (gridfs/find-one-as-map fs query))))
             md5 {:_id (:_id stored)}
             md5 {:md5 md5})))
    (testing "gridfs/find-by-id"
      (purge-gridfs*)
      (let [input  "./test/resources/mongo/js/mapfun1.js"
            ct     "binary/octet-stream"
            fname  "monger.test.gridfs.file5"
            md5    "14a09deabb50925a3381315149017bbd"
            stored (store-file (make-input-file fs input)
                               (filename fname)
                               (content-type ct))]
        (is (= 1 (count (gridfs/all-files fs))))
        (is (:_id stored))
        (is (:uploadDate stored))
        (is (= 62 (:length stored)))
        (is (= md5 (:md5 stored)))
        (is (= fname (:filename stored)))
        (is (= ct (:contentType stored)))
        (are [a id] (is (= a (:md5 (from-db-object (gridfs/find-by-id fs id) true))))
             md5 (:_id stored))))
    (testing "gridfs/find-map-by-id"
      (purge-gridfs*)
      (let [input  "./test/resources/mongo/js/mapfun1.js"
            ct     "binary/octet-stream"
            fname  "monger.test.gridfs.file6"
            md5    "14a09deabb50925a3381315149017bbd"
            stored (store-file (make-input-file fs input)
                               (filename fname)
                               (metadata (to-db-object {:meta "data"}))
                               (content-type ct))]
        (is (= 1 (count (gridfs/all-files fs))))
        (is (:_id stored))
        (is (:uploadDate stored))
        (is (= 62 (:length stored)))
        (is (= md5 (:md5 stored)))
        (is (= fname (:filename stored)))
        (is (= ct (:contentType stored)))
        (let [m (gridfs/find-map-by-id fs (:_id stored))]
          (is (= {:meta "data"} (:metadata m))))
        (are [a id] (is (= a (:md5 (gridfs/find-map-by-id fs id))))
             md5 (:_id stored)))))
(deftest ^{:gridfs true} test-storing-files-to-gridfs-using-relative-fs-paths
  (let [input "./test/resources/mongo/js/mapfun1.js"]
    (is (= 0 (count (gridfs/all-files))))
    (store (make-input-file input)
           (.setFilename "monger.test.gridfs.file1")
           (.setContentType "application/octet-stream"))
    (is (= 1 (count (gridfs/all-files))))))

  (deftest ^{:gridfs true} test-finding-multiple-files-on-gridfs

(deftest ^{:gridfs true} test-storing-files-to-gridfs-using-file-instances
  (let [input (io/as-file "./test/resources/mongo/js/mapfun1.js")]
    (is (= 0 (count (gridfs/all-files))))
    (store-file (make-input-file input)
                (filename "monger.test.gridfs.file2")
                (content-type "application/octet-stream"))
    (is (= 1 (count (gridfs/all-files))))))

(deftest ^{:gridfs true} test-storing-bytes-to-gridfs
  (let [input (.getBytes "A string")
        md    {:format "raw" :source "AwesomeCamera D95"}
        fname "monger.test.gridfs.file3"
        ct    "application/octet-stream"]
    (is (= 0 (count (gridfs/all-files))))
    (store-file (make-input-file input)
                (filename fname)
                (metadata md)
                (content-type "application/octet-stream"))
    (let [f (first (gridfs/files-as-maps))]
      (is (= ct (:contentType f)))
      (is (= fname (:filename f)))
      (is (= md (:metadata f))))
    (is (= 1 (count (gridfs/all-files))))))

(deftest ^{:gridfs true} test-storing-files-to-gridfs-using-absolute-fs-paths
  (let [tmp-file (File/createTempFile "monger.test.gridfs" "test-storing-files-to-gridfs-using-absolute-fs-paths")
        _        (spit tmp-file "Some content")
        input    (.getAbsolutePath tmp-file)]
    (is (= 0 (count (gridfs/all-files))))
    (store-file (make-input-file input)
                (filename "monger.test.gridfs.file4")
                (content-type "application/octet-stream"))
    (is (= 1 (count (gridfs/all-files))))))

(deftest ^{:gridfs true} test-storing-files-to-gridfs-using-input-stream
  (let [tmp-file (File/createTempFile "monger.test.gridfs" "test-storing-files-to-gridfs-using-input-stream")
        _        (spit tmp-file "Some other content")]
    (is (= 0 (count (gridfs/all-files))))
    (store-file (make-input-file (FileInputStream. tmp-file))
                (filename "monger.test.gridfs.file4b")
                (content-type "application/octet-stream"))
    (is (= 1 (count (gridfs/all-files))))))


(deftest ^{:gridfs true} test-finding-individual-files-on-gridfs
  (testing "gridfs/find-one"
    (purge-gridfs*)
    (let [input  "./test/resources/mongo/js/mapfun1.js"
          ct     "binary/octet-stream"
          fname  "monger.test.gridfs.file5"
          md5    "14a09deabb50925a3381315149017bbd"
          stored (store-file (make-input-file input)
                             (filename fname)
                             (content-type ct))]
      (is (= 1 (count (gridfs/all-files))))
      (is (:_id stored))
      (is (:uploadDate stored))
      (is (= 62 (:length stored)))
      (is (= md5 (:md5 stored)))
      (is (= fname (:filename stored)))
      (is (= ct (:contentType stored)))
      (are [a b] (is (= a (:md5 (from-db-object (gridfs/find-one b) true))))
           md5 (:_id stored)
           md5 fname
           md5 (to-db-object {:md5 md5}))))
  (testing "gridfs/find-one-as-map"
    (purge-gridfs*)
    (let [input   "./test/resources/mongo/js/mapfun1.js"
          ct      "binary/octet-stream"
          fname   "monger.test.gridfs.file6"
          md5     "14a09deabb50925a3381315149017bbd"
          stored1 (store-file (make-input-file fs input)
                              (filename "monger.test.gridfs.file6")
                              (content-type ct))
          stored2 (store-file (make-input-file fs input)
                              (filename "monger.test.gridfs.file7")
                              (content-type ct))
          list1   (gridfs/find-by-filename fs "monger.test.gridfs.file6")
          list2   (gridfs/find-by-filename fs "monger.test.gridfs.file7")
          list3   (gridfs/find-by-filename fs "888000___.monger.test.gridfs.file")
          list4   (gridfs/find-by-md5 fs md5)]
      (is (= 2 (count (gridfs/all-files fs))))
      (are [a b] (is (= (map #(.get ^GridFSDBFile % "_id") a)
                        (map :_id b)))
           list1 [stored1]
           list2 [stored2]
           list3 []
           list4 [stored1 stored2])))
          stored (store-file (make-input-file input)
                             (filename fname)
                             (metadata (to-db-object {:meta "data"}))
                             (content-type ct))]
      (is (= 1 (count (gridfs/all-files))))
      (is (:_id stored))
      (is (:uploadDate stored))
      (is (= 62 (:length stored)))
      (is (= md5 (:md5 stored)))
      (is (= fname (:filename stored)))
      (is (= ct (:contentType stored)))
      (let [m (gridfs/find-one-as-map {:filename fname})]
        (is (= {:meta "data"} (:metadata m))))
      (are [a query] (is (= a (:md5 (gridfs/find-one-as-map query))))
           md5 (:_id stored)
           md5 fname
           md5 {:md5 md5}))))

(deftest ^{:gridfs true} test-finding-multiple-files-on-gridfs
  (let [input   "./test/resources/mongo/js/mapfun1.js"
        ct      "binary/octet-stream"
        md5     "14a09deabb50925a3381315149017bbd"
        stored1 (store-file (make-input-file input)
                            (filename "monger.test.gridfs.file6")
                            (content-type ct))
        stored2 (store-file (make-input-file input)
                            (filename "monger.test.gridfs.file7")
                            (content-type ct))
        list1   (gridfs/find "monger.test.gridfs.file6")
        list2   (gridfs/find "monger.test.gridfs.file7")
        list3   (gridfs/find "888000___.monger.test.gridfs.file")
        list4   (gridfs/find { :md5 md5 })]
    (is (= 2 (count (gridfs/all-files))))
    (are [a b] (is (= (map #(.get ^GridFSDBFile % "_id") a)
                      (map :_id b)))
         list1 [stored1]
         list2 [stored2]
         list3 []
         list4 [stored1 stored2])))


  (deftest ^{:gridfs true} test-removing-multiple-files-from-gridfs
    (let [input   "./test/resources/mongo/js/mapfun1.js"
          ct      "binary/octet-stream"
          md5     "14a09deabb50925a3381315149017bbd"
          stored1 (store-file (make-input-file fs input)
                              (filename "monger.test.gridfs.file8")
                              (content-type ct))
          stored2 (store-file (make-input-file fs input)
                              (filename "monger.test.gridfs.file9")
                              (content-type ct))]
      (is (= 2 (count (gridfs/all-files fs))))
      (gridfs/remove fs { :filename "monger.test.gridfs.file8" })
      (is (= 1 (count (gridfs/all-files fs))))
      (gridfs/remove fs { :md5 md5 })
      (is (= 0 (count (gridfs/all-files fs)))))))
(deftest ^{:gridfs true} test-removing-multiple-files-from-gridfs
  (let [input   "./test/resources/mongo/js/mapfun1.js"
        ct      "binary/octet-stream"
        md5     "14a09deabb50925a3381315149017bbd"
        stored1 (store-file (make-input-file input)
                            (filename "monger.test.gridfs.file8")
                            (content-type ct))
        stored2 (store-file (make-input-file input)
                            (filename "monger.test.gridfs.file9")
                            (content-type ct))]
    (is (= 2 (count (gridfs/all-files))))
    (gridfs/remove { :filename "monger.test.gridfs.file8" })
    (is (= 1 (count (gridfs/all-files))))
    (gridfs/remove { :md5 md5 })
    (is (= 0 (count (gridfs/all-files))))))

17
test/monger/test/helper.clj
Normal file

@@ -0,0 +1,17 @@
(ns monger.test.helper
  (:require [monger core util])
  (:import [com.mongodb WriteConcern]))

(def connected (atom false))
(defn connected?
  []
  @connected)

(defn connect!
  []
  (when-not (connected?)
    (do
      (monger.core/connect!)
      (monger.core/set-db! (monger.core/get-db "monger-test"))
      (monger.core/set-default-write-concern! WriteConcern/SAFE)
      (reset! connected true))))

@@ -1,49 +1,53 @@
(ns monger.test.indexing-test
  (:import org.bson.types.ObjectId
           java.util.Date)
  (:require [monger.core :as mg]
  (:require [monger core util]
            [monger.collection :as mc]
            monger.joda-time
            [clojure.test :refer :all]
            [clj-time.core :refer [now seconds ago from-now]]))
            [monger.test.helper :as helper]
            monger.joda-time)
  (:use clojure.test
        monger.test.fixtures
        [clj-time.core :only [now secs ago from-now]]))

(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")]
  (deftest ^{:indexing true} test-creating-and-dropping-indexes
    (let [collection "libraries"]
      (mc/drop-indexes db collection)
      (mc/create-index db collection {"language" 1})
      (is (= "language_1"
             (:name (second (mc/indexes-on db collection)))))
      (mc/drop-indexes db collection)
      (is (nil? (second (mc/indexes-on db collection))))
      (mc/ensure-index db collection (array-map "language" 1) {:unique true})
      (is (= "language_1"
             (:name (second (mc/indexes-on db collection)))))
      (mc/drop-indexes db collection)
      (mc/ensure-index db collection (array-map "language" 1))
      (mc/drop-indexes db collection)
      (mc/ensure-index db collection (array-map "language" 1) {:unique true})
      (mc/drop-indexes db collection)
      (mc/ensure-index db collection (array-map "language" 1) "index-name" true)
      (mc/drop-indexes db collection)))
(helper/connect!)

  (deftest ^{:indexing true :time-consuming true} test-ttl-collections
    (let [coll  "recent_events"
          ttl   15
          sleep 65]
      (mc/remove db coll)
      (mc/drop-indexes db coll)
      (mc/ensure-index db coll (array-map :created-at 1) {:expireAfterSeconds ttl})
      (dotimes [i 100]
        (mc/insert db coll {:type "signup" :created-at (-> i seconds ago) :i i}))
      (dotimes [i 100]
        (mc/insert db coll {:type "signup" :created-at (-> i seconds from-now) :i i}))
      (is (= 200 (mc/count db coll {:type "signup"})))
      ;; sleep for > 60 seconds. MongoDB seems to run TTLMonitor once per minute, according to
      ;; the log.
      (println (format "Now sleeping for %d seconds to test TTL collections!" sleep))
      (Thread/sleep (* sleep 1000))
      (println (format "Documents in the TTL collection: %d" (mc/count db coll {:type "signup"})))
      (is (< (mc/count db coll {:type "signup"}) 100))
      (mc/remove db coll))))

;;
;; indexes
;;

(deftest ^{:indexing true} test-creating-and-dropping-indexes
  (let [collection "libraries"]
    (mc/drop-indexes collection)
    (mc/create-index collection { "language" 1 })
    (is (= "language_1"
           (:name (second (mc/indexes-on collection)))))
    (mc/drop-index collection "language_1")
    (mc/create-index collection ["language"])
    (mc/drop-index collection "language_1")
    (is (nil? (second (mc/indexes-on collection))))
    (mc/ensure-index collection (array-map "language" 1) {:unique true})
    (is (= "language_1"
           (:name (second (mc/indexes-on collection)))))
    (mc/ensure-index collection (array-map "language" 1))
    (mc/ensure-index collection (array-map "language" 1) { :unique true })
    (mc/drop-indexes collection)))

(deftest ^{:indexing true :edge-features true :time-consuming true} test-ttl-collections
  (let [coll  "recent_events"
        ttl   30
        sleep 120]
    (mc/remove coll)
    (mc/ensure-index coll (array-map :created-at 1) {:expireAfterSeconds ttl})
    (dotimes [i 100]
      (mc/insert coll {:type "signup" :created-at (-> i secs ago) :i i}))
    (dotimes [i 100]
      (mc/insert coll {:type "signup" :created-at (-> i secs from-now) :i i}))
    (is (= 200 (mc/count coll {:type "signup"})))
    ;; sleep for 65 seconds. MongoDB 2.1.2 seems to run TTLMonitor once per minute, according to
    ;; the log. MK.
    (println (format "Now sleeping for %d seconds to test TTL collections!" sleep))
    (Thread/sleep (* sleep 1000))
    (println (format "Documents in the TTL collection: %d" (mc/count coll {:type "signup"})))
    (is (< (mc/count coll {:type "signup"}) 100))
    (mc/remove coll)))
@@ -1,3 +1,5 @@
(set! *warn-on-reflection* true)

(ns monger.test.inserting-test
  (:import [com.mongodb WriteResult WriteConcern DBCursor DBObject DBRef]
           org.bson.types.ObjectId
@@ -5,176 +7,165 @@
  (:require [monger.core :as mg]
            [monger.util :as mu]
            [monger.collection :as mc]
            [clojure.test :refer :all]
            [monger.operators :refer :all]
            [monger.conversion :refer :all]))
            [monger.test.helper :as helper])
  (:use clojure.test
        monger.operators
        monger.conversion
        monger.test.fixtures))

(helper/connect!)

(use-fixtures :each purge-people purge-docs purge-things purge-libraries)


;;
;; insert
;;

(deftest insert-a-basic-document-without-id-and-with-default-write-concern
  (let [collection "people"
        doc        {:name "Joe" :age 30}]
    (is (monger.result/ok? (mc/insert "people" doc)))
    (is (= 1 (mc/count collection)))))

(deftest insert-a-basic-document-with-explicitly-passed-database-without-id-and-with-default-write-concern
  (let [collection "people"
        doc        {:name "Joe" :age 30}]
    (dotimes [n 5]
      (is (monger.result/ok? (mc/insert monger.core/*mongodb-database* "people" doc WriteConcern/SAFE))))
    (is (= 5 (mc/count collection)))))

(deftest insert-a-basic-document-without-id-and-with-explicit-write-concern
  (let [collection "people"
        doc        {:name "Joe" :age 30}]
    (is (monger.result/ok? (mc/insert "people" doc WriteConcern/SAFE)))
    (is (= 1 (mc/count collection)))))

(deftest insert-a-basic-db-object-without-id-and-with-default-write-concern
  (let [collection "people"
        doc        (to-db-object {:name "Joe" :age 30})]
    (is (nil? (.get ^DBObject doc "_id")))
    (mc/insert "people" doc)
    (is (not (nil? (monger.util/get-id doc))))))

(deftest insert-a-map-with-id-and-with-default-write-concern
  (let [collection "people"
        id         (ObjectId.)
        doc        {:name "Joe" :age 30 "_id" id}
        result     (mc/insert "people" doc)]
    (is (= id (monger.util/get-id doc)))))

(deftest insert-a-document-with-clojure-ratio-in-it
  (let [collection "widgets"
        id         (ObjectId.)
        doc        {:ratio 11/2 "_id" id}
        result     (mc/insert "widgets" doc)]
    (is (= 5.5 (:ratio (mc/find-map-by-id collection id))))))

(deftest insert-a-document-with-clojure-keyword-in-it
  (let [collection "widgets"
        id         (ObjectId.)
        doc        {:keyword :kwd "_id" id}
        result     (mc/insert "widgets" doc)]
    (is (= (name :kwd) (:keyword (mc/find-map-by-id collection id))))))

(deftest insert-a-document-with-clojure-keyword-in-a-set-in-it
  (let [collection "widgets"
        id         (ObjectId.)
        doc        {:keyword1 {:keyword2 #{:kw1 :kw2}} "_id" id}
        result     (mc/insert "widgets" doc)]
    (is (= (sort ["kw1" "kw2"])
           (sort (get-in (mc/find-map-by-id collection id) [:keyword1 :keyword2]))))))


(defrecord Metrics
    [rps eps])

(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")]
  (defn purge-collections
    [f]
    (mc/remove db "people")
    (mc/remove db "docs")
    (mc/remove db "things")
    (mc/remove db "widgets")
    (f)
    (mc/remove db "people")
    (mc/remove db "docs")
    (mc/remove db "things")
    (mc/remove db "widgets"))

(deftest insert-a-document-with-clojure-record-in-it
  (let [collection "widgets"
        id         (ObjectId.)
        doc        {:record (Metrics. 10 20) "_id" id}
        result     (mc/insert "widgets" doc)]
    (is (= {:rps 10 :eps 20} (:record (mc/find-map-by-id collection id))))))

(use-fixtures :each purge-collections)

(deftest test-insert-a-document-with-dbref
  (mc/remove "widgets")
  (mc/remove "owners")
  (let [coll1 "widgets"
        coll2 "owners"
        oid   (ObjectId.)
        joe   (mc/insert "owners" {:name "Joe" :_id oid})
        dbref (DBRef. (mg/current-db) coll2 oid)]
    (mc/insert coll1 {:type "pentagon" :owner dbref})
    (let [fetched (mc/find-one-as-map coll1 {:type "pentagon"})
          fo      (:owner fetched)]
      (is (= {:_id oid :name "Joe"} (from-db-object @fo true))))))


;;
;; insert
;;

;;
;; insert-and-return
;;

(deftest insert-a-basic-document-without-id-and-with-default-write-concern
  (let [collection "people"
        doc        {:name "Joe" :age 30}]
    (is (mc/insert db collection doc))
    (is (= 1 (mc/count db collection)))))

(deftest insert-and-return-a-basic-document-without-id-and-with-default-write-concern
  (let [collection "people"
        doc        {:name "Joe" :age 30}
        result     (mc/insert-and-return :people doc)]
    (is (= (:name doc)
           (:name result)))
    (is (= (:age doc)
           (:age result)))
    (is (:_id result))
    (is (= 1 (mc/count collection)))))

(deftest insert-a-basic-document-with-explicitly-passed-database-without-id-and-with-default-write-concern
  (let [collection "people"
        doc        {:name "Joe" :age 30}]
    (dotimes [n 5]
      (mc/insert db collection doc WriteConcern/SAFE))
    (is (= 5 (mc/count db collection)))))

(deftest insert-and-return-a-basic-document-without-id-but-with-a-write-concern
  (let [collection "people"
        doc        {:name "Joe" :age 30 :ratio 3/4}
        result     (mc/insert-and-return "people" doc WriteConcern/FSYNC_SAFE)]
    (is (= (:name doc)
           (:name result)))
    (is (= (:age doc)
           (:age result)))
    (is (= (:ratio doc)
           (:ratio result)))
    (is (:_id result))
    (is (= 1 (mc/count collection)))))

(deftest insert-a-basic-document-without-id-and-with-explicit-write-concern
  (let [collection "people"
        doc        {:name "Joe" :age 30}]
    (is (mc/insert db collection doc WriteConcern/SAFE))
    (is (= 1 (mc/count db collection)))))

(deftest insert-a-basic-db-object-without-id-and-with-default-write-concern
  (let [collection "people"
        doc        (to-db-object {:name "Joe" :age 30})]
    (is (nil? (.get ^DBObject doc "_id")))
    (mc/insert db collection doc)
    (is (not (nil? (monger.util/get-id doc))))))

(deftest insert-a-map-with-id-and-with-default-write-concern
  (let [collection "people"
        id         (ObjectId.)
        doc        {:name "Joe" :age 30 "_id" id}
        result     (mc/insert db collection doc)]
    (is (= id (monger.util/get-id doc)))))

(deftest insert-a-document-with-clojure-ratio-in-it
  (let [collection "widgets"
        id         (ObjectId.)
        doc        {:ratio 11/2 "_id" id}
        result     (mc/insert db collection doc)]
    (is (= 5.5 (:ratio (mc/find-map-by-id db collection id))))))

(deftest insert-a-document-with-clojure-keyword-in-it
  (let [collection "widgets"
        id         (ObjectId.)
        doc        {:keyword :kwd "_id" id}
        result     (mc/insert db collection doc)]
    (is (= (name :kwd) (:keyword (mc/find-map-by-id db collection id))))))

(deftest insert-a-document-with-clojure-keyword-in-a-set-in-it
  (let [collection "widgets"
        id         (ObjectId.)
        doc        {:keyword1 {:keyword2 #{:kw1 :kw2}} "_id" id}
        result     (mc/insert db collection doc)]
    (is (= (sort ["kw1" "kw2"])
           (sort (get-in (mc/find-map-by-id db collection id) [:keyword1 :keyword2]))))))

(deftest insert-a-document-with-clojure-record-in-it
  (let [collection "widgets"
        id         (ObjectId.)
        doc        {:record (Metrics. 10 20) "_id" id}
        result     (mc/insert db collection doc)]
    (is (= {:rps 10 :eps 20} (:record (mc/find-map-by-id db collection id))))))

;; TODO: disabled until we figure out how to implement dereferencing of DBRefs
;; in 3.0 in a compatible way (and if that's possible at all). MK.
#_ (deftest test-insert-a-document-with-dbref
     (mc/remove db "widgets")
     (mc/remove db "owners")
     (let [coll1 "widgets"
           coll2 "owners"
           oid   (ObjectId.)
           joe   (mc/insert db coll2 {:name "Joe" :_id oid})
           dbref (DBRef. coll2 oid)]
       (mc/insert db coll1 {:type "pentagon" :owner dbref})
       (let [fetched (mc/find-one-as-map db coll1 {:type "pentagon"})
             fo      (:owner fetched)]
         (is (= {:_id oid :name "Joe"} (from-db-object @fo true))))))

(deftest insert-and-return-with-a-provided-id
  (let [collection "people"
        oid        (ObjectId.)
        doc        {:name "Joe" :age 30 :_id oid}
        result     (mc/insert-and-return :people doc)]
    (is (= (:_id result) (:_id doc) oid))
    (is (= 1 (mc/count collection)))))


;;
;; insert-and-return
;;

;;
;; insert-batch
;;

(deftest insert-and-return-a-basic-document-without-id-and-with-default-write-concern
  (let [collection "people"
        doc        {:name "Joe" :age 30}
        result     (mc/insert-and-return db collection doc)]
    (is (= (:name doc)
           (:name result)))
    (is (= (:age doc)
           (:age result)))
    (is (:_id result))
    (is (= 1 (mc/count db collection)))))

(deftest insert-a-batch-of-basic-documents-without-ids-and-with-default-write-concern
  (let [collection "people"
        docs       [{:name "Joe" :age 30} {:name "Paul" :age 27}]]
    (is (monger.result/ok? (mc/insert-batch "people" docs)))
    (is (= 2 (mc/count collection)))))

(deftest insert-and-return-a-basic-document-without-id-but-with-a-write-concern
  (let [collection "people"
        doc        {:name "Joe" :age 30 :ratio 3/4}
        result     (mc/insert-and-return db collection doc WriteConcern/FSYNC_SAFE)]
    (is (= (:name doc)
           (:name result)))
    (is (= (:age doc)
           (:age result)))
    (is (= (:ratio doc)
           (:ratio result)))
    (is (:_id result))
    (is (= 1 (mc/count db collection)))))

(deftest insert-a-batch-of-basic-documents-without-ids-and-with-explicit-write-concern
  (let [collection "people"
        docs       [{:name "Joe" :age 30} {:name "Paul" :age 27}]]
    (is (monger.result/ok? (mc/insert-batch "people" docs WriteConcern/NORMAL)))
    (is (= 2 (mc/count collection)))))

(deftest insert-and-return-with-a-provided-id
  (let [collection "people"
        oid        (ObjectId.)
        doc        {:name "Joe" :age 30 :_id oid}
        result     (mc/insert-and-return db collection doc)]
    (is (= (:_id result) (:_id doc) oid))
    (is (= 1 (mc/count db collection)))))

(deftest insert-a-batch-of-basic-documents-with-explicit-database-without-ids-and-with-explicit-write-concern
  (let [collection "people"
        docs       [{:name "Joe" :age 30} {:name "Paul" :age 27}]]
    (dotimes [n 44]
      (is (monger.result/ok? (mc/insert-batch monger.core/*mongodb-database* "people" docs WriteConcern/NORMAL))))
    (is (= 88 (mc/count collection)))))


;;
;; insert-batch
;;

(deftest insert-a-batch-of-basic-documents-without-ids-and-with-default-write-concern
  (let [collection "people"
        docs       [{:name "Joe" :age 30} {:name "Paul" :age 27}]]
    (is (mc/insert-batch db collection docs))
    (is (= 2 (mc/count db collection)))))

(deftest insert-a-batch-of-basic-documents-without-ids-and-with-explicit-write-concern
  (let [collection "people"
        docs       [{:name "Joe" :age 30} {:name "Paul" :age 27}]]
    (is (mc/insert-batch db collection docs WriteConcern/FSYNCED))
    (is (= 2 (mc/count db collection)))))

(deftest insert-a-batch-of-basic-documents-with-explicit-database-without-ids-and-with-explicit-write-concern
  (let [collection "people"
        docs       [{:name "Joe" :age 30} {:name "Paul" :age 27}]]
    (dotimes [n 44]
      (is (mc/insert-batch db collection docs WriteConcern/FSYNCED)))
    (is (= 88 (mc/count db collection)))))

(deftest insert-a-batch-of-basic-documents-from-a-lazy-sequence
  (let [collection "people"
        numbers    (range 0 1000)]
    (is (mc/insert-batch db collection (map (fn [^long l]
                                              {:n l})
                                            numbers)))
    (is (= (count numbers) (mc/count db collection))))))

(deftest insert-a-batch-of-basic-documents-from-a-lazy-sequence
  (let [collection "people"
        numbers    (range 0 1000)]
    (is (monger.result/ok? (mc/insert-batch "people" (map (fn [^long l]
                                                            {:n l})
                                                          numbers))))
    (is (= (count numbers) (mc/count collection)))))
45 test/monger/test/internal/fn_test.clj Normal file

@@ -0,0 +1,45 @@
(ns monger.test.internal.fn-test
  (:use clojure.test
        monger.internal.fn))


(deftest test-expand-all
  (are [i o] (is (= (expand-all i) o))
    { :int (fn [] 1) :str "Clojure" :float (Float/valueOf 11.0) } { :int 1 :str "Clojure" :float (Float/valueOf 11.0) }
    { :long (fn [] (Long/valueOf 11)) } { :long (Long/valueOf 11) }
    {
     :i 1
     :l (Long/valueOf 1111)
     :s "Clojure"
     :d (Double/valueOf 11.1)
     :f (Float/valueOf 2.5)
     :v [1 2 3]
     :dyn-v [(fn [] 10) (fn [] 20) (fn [] 30)]
     :dyn-i (fn [] 1)
     :dyn-s (fn [] "Clojure (expanded)")
     :m { :nested "String" }
     :dyn-m { :abc (fn [] :abc) :nested { :a { :b { :c (fn [] "d") } } } }
     }
    {
     :i 1
     :l (Long/valueOf 1111)
     :s "Clojure"
     :d (Double/valueOf 11.1)
     :f (Float/valueOf 2.5)
     :v [1 2 3]
     :dyn-v [10 20 30]
     :dyn-i 1
     :dyn-s "Clojure (expanded)"
     :m { :nested "String" }
     :dyn-m {
             :abc :abc
             :nested { :a { :b { :c "d" } } }
             }
     }))

(deftest test-expand-all-with
  (let [expander-fn (fn [f]
                      (* 3 (f)))]
    (are [i o] (is (= (expand-all-with i expander-fn) o))
      { :a 1 :int (fn [] 3) } { :a 1 :int 9 }
      { :v [(fn [] 1) (fn [] 11)] :m { :inner (fn [] 3) } :s "Clojure" } { :v [3 33] :m { :inner 9 } :s "Clojure" })))
@@ -1,6 +1,6 @@
(ns monger.test.internal.pagination-test
  (:require [clojure.test :refer :all]
            [monger.internal.pagination :refer :all]))
  (:use clojure.test
        monger.internal.pagination))

(deftest test-pagination-offset
  (are [a b] (= a b)
@@ -1,6 +1,9 @@
(ns monger.test.js-test
  (:require monger.js
            [clojure.test :refer :all]))
            [monger.test.helper :as helper])
  (:use clojure.test))

(helper/connect!)

(deftest load-js-resource-using-path-on-the-classpath
  (are [c path] (= c (count (monger.js/load-resource path)))
@@ -1,16 +0,0 @@
(ns monger.test.json-cheshire-test
  (:require [clojure.test :refer :all]
            [monger.json]
            [cheshire.core :refer :all])
  (:import org.bson.types.ObjectId
           org.bson.types.BSONTimestamp))

(deftest convert-dbobject-to-json
  (let [input  (ObjectId.)
        output (generate-string input)]
    (is (= (str "\"" input "\"") output))))

(deftest convert-bson-timestamp-to-json
  (let [input  (BSONTimestamp. 123 4)
        output (generate-string input)]
    (is (= "{\"time\":123,\"inc\":4}" output))))
@@ -1,16 +0,0 @@
(ns monger.test.json-test
  (:require [clojure.test :refer :all]
            [monger.json]
            [clojure.data.json :as json])
  (:import org.bson.types.ObjectId
           org.bson.types.BSONTimestamp))

(deftest convert-dbobject-to-json
  (let [input  (ObjectId.)
        output (json/write-str input)]
    (is (= (str "\"" input "\"") output))))

(deftest convert-bson-timestamp-to-json
  (let [input  (BSONTimestamp. 123 4)
        output (json/write-str input)]
    (is (= "{\"time\":123,\"inc\":4}" output))))
@@ -1,13 +1,13 @@
(ns monger.test.lib-integration-test
  (:import [org.joda.time DateTime DateMidnight LocalDate]
  (:use clojure.test
        monger.conversion)
  (:import [org.joda.time DateTime DateMidnight]
           org.bson.types.ObjectId
           com.mongodb.DBObject)
  (:require monger.json
            monger.joda-time
            [clj-time.core :as t]
            [cheshire.core :as json]
            [clojure.test :refer :all]
            [monger.conversion :refer :all]))
            [cheshire.core :as json]))


(deftest ^{:integration true} serialization-of-joda-datetime-to-json

@@ -31,12 +31,6 @@
    (is (instance? java.util.Date d))
    (is (= 1318464000000 (.getTime ^java.util.Date d)))))

(deftest ^{:integration true} conversion-of-joda-localdate-to-db-object
  (let [d (to-db-object (LocalDate. 2011 10 13))]
    (is (instance? java.util.Date d))
    (is (= 111 (.getYear ^java.util.Date d))) ;; how many years since 1900
    (is (= 9 (.getMonth ^java.util.Date d))) ;; java.util.Date counts from 0
    (is (= 13 (.getDate ^java.util.Date d)))))

(deftest ^{:integration true} conversion-of-java-util-date-to-joda-datetime
  (let [input (.toDate ^DateTime (t/date-time 2011 10 13 23 55 0))

@@ -44,12 +38,8 @@
    (is (instance? org.joda.time.DateTime output))
    (is (= input (.toDate ^DateTime output)))))


(deftest ^{:integration true} test-reader-extensions
  (let [^DateTime d (t/date-time 2011 10 13 23 55 0)]
    (binding [*print-dup* true]
      (pr-str d))))

(deftest ^{:integration true} test-reader-extensions-for-localdate
  (let [^DateTime d (t/today)]
    (binding [*print-dup* true]
      (pr-str d))))
69 test/monger/test/map_reduce_test.clj Normal file

@@ -0,0 +1,69 @@
(ns monger.test.map-reduce-test
  (:import [com.mongodb WriteResult WriteConcern DBCursor DBObject MapReduceOutput MapReduceCommand MapReduceCommand$OutputType]
           org.bson.types.ObjectId
           java.util.Date)
  (:require [monger core util]
            [monger.collection :as mc]
            [monger.result :as mgres]
            [clojurewerkz.support.js :as js]
            [monger.test.helper :as helper])
  (:use clojure.test
        [monger operators conversion]
        monger.test.fixtures))

(helper/connect!)

(use-fixtures :each purge-people purge-docs purge-things purge-libraries)


;;
;; Map/Reduce
;;

(let [collection "widgets"
      mapper     (js/load-resource "resources/mongo/js/mapfun1.js")
      reducer    "function(key, values) {
                    var result = 0;
                    values.forEach(function(v) { result += v });

                    return result;
                  }"
      batch      [{ :state "CA" :quantity 1 :price 199.00 }
                  { :state "NY" :quantity 2 :price 199.00 }
                  { :state "NY" :quantity 1 :price 299.00 }
                  { :state "IL" :quantity 2 :price 11.50 }
                  { :state "CA" :quantity 2 :price 2.95 }
                  { :state "IL" :quantity 3 :price 5.50 }]
      expected   [{:_id "CA", :value 204.9} {:_id "IL", :value 39.5} {:_id "NY", :value 697.0}]]
  (deftest test-basic-inline-map-reduce-example
    (mc/remove monger.core/*mongodb-database* collection {})
    (is (mgres/ok? (mc/insert-batch collection batch)))
    (let [output  (mc/map-reduce collection mapper reducer nil MapReduceCommand$OutputType/INLINE {})
          results (from-db-object ^DBObject (.results ^MapReduceOutput output) true)]
      (mgres/ok? output)
      (is (= expected results))))

  (deftest test-basic-map-reduce-example-that-replaces-named-collection
    (mc/remove monger.core/*mongodb-database* collection {})
    (is (mgres/ok? (mc/insert-batch collection batch)))
    (let [output  (mc/map-reduce collection mapper reducer "mr_outputs" {})
          results (from-db-object ^DBObject (.results ^MapReduceOutput output) true)]
      (mgres/ok? output)
      (is (= 3 (monger.core/count results)))
      (is (= expected
             (map #(from-db-object % true) (seq results))))
      (is (= expected
             (map #(from-db-object % true) (mc/find "mr_outputs"))))
      (.drop ^MapReduceOutput output)))

  (deftest test-basic-map-reduce-example-that-merged-results-into-named-collection
    (mc/remove monger.core/*mongodb-database* collection {})
    (is (mgres/ok? (mc/insert-batch collection batch)))
    (mc/map-reduce collection mapper reducer "merged_mr_outputs" MapReduceCommand$OutputType/MERGE {})
    (is (mgres/ok? (mc/insert collection { :state "OR" :price 17.95 :quantity 4 })))
    (let [^MapReduceOutput output (mc/map-reduce collection mapper reducer "merged_mr_outputs" MapReduceCommand$OutputType/MERGE {})]
      (mgres/ok? output)
      (is (= 4 (monger.core/count output)))
      (is (= ["CA" "IL" "NY" "OR"]
             (map :_id (mc/find-maps "merged_mr_outputs"))))
      (.drop ^MapReduceOutput output))))
407 test/monger/test/multi/atomic_modifiers_test.clj Normal file

@@ -0,0 +1,407 @@
(set! *warn-on-reflection* true)

(ns monger.test.multi.atomic-modifiers-test
  (:import [com.mongodb WriteResult WriteConcern DBCursor DBObject]
           [org.bson.types ObjectId]
           [java.util Date])
  (:require [monger.core :as mg]
            [monger core util]
            [monger.multi.collection :as mgcol]
            [monger.result :as mgres]
            [monger.test.helper :as helper])
  (:use [clojure.test]
        [monger.operators]
        [monger.test.fixtures]))

(helper/connect!)

(defn drop-altdb
  [f]
  (mg/drop-db "altdb")
  (f))

(use-fixtures :each drop-altdb)

;;
;; $inc
;;

(deftest increment-a-single-existing-field-using-$inc-modifier
  (let [db   (mg/get-db "altdb")
        coll "scores"
        oid  (ObjectId.)]
    (mgcol/insert db coll { :_id oid :username "l33r0y" :score 100 })
    (mgcol/update db coll { :_id oid } { $inc { :score 20 } })
    (is (= 120 (:score (mgcol/find-map-by-id db coll oid))))))

(deftest set-a-single-non-existing-field-using-$inc-modifier
  (let [db   (mg/get-db "altdb")
        coll "scores"
        oid  (ObjectId.)]
    (mgcol/insert db coll { :_id oid :username "l33r0y" })
    (mgcol/update db coll { :_id oid } { $inc { :score 30 } })
    (is (= 30 (:score (mgcol/find-map-by-id db coll oid))))))


(deftest increment-multiple-existing-fields-using-$inc-modifier
  (let [db   (mg/get-db "altdb")
        coll "scores"
        oid  (ObjectId.)]
    (mgcol/insert db coll { :_id oid :username "l33r0y" :score 100 :bonus 0 })
    (mgcol/update db coll { :_id oid } {$inc { :score 20 :bonus 10 } })
    (is (= { :_id oid :score 120 :bonus 10 :username "l33r0y" } (mgcol/find-map-by-id db coll oid)))))


(deftest increment-and-set-multiple-existing-fields-using-$inc-modifier
  (let [db   (mg/get-db "altdb")
        coll "scores"
        oid  (ObjectId.)]
    (mgcol/insert db coll { :_id oid :username "l33r0y" :score 100 })
    (mgcol/update db coll { :_id oid } { $inc { :score 20 :bonus 10 } })
    (is (= { :_id oid :score 120 :bonus 10 :username "l33r0y" } (mgcol/find-map-by-id db coll oid)))))


;;
;; $set
;;

(deftest update-a-single-existing-field-using-$set-modifier
  (let [db   (mg/get-db "altdb")
        coll "things"
        oid  (ObjectId.)]
    (mgcol/insert db coll { :_id oid :weight 10.0 })
    (mgcol/update db coll { :_id oid } { $set { :weight 20.5 } })
    (is (= 20.5 (:weight (mgcol/find-map-by-id db coll oid [:weight]))))))

(deftest set-a-single-non-existing-field-using-$set-modifier
  (let [db   (mg/get-db "altdb")
        coll "things"
        oid  (ObjectId.)]
    (mgcol/insert db coll { :_id oid :weight 10.0 })
    (mgcol/update db coll { :_id oid } { $set { :height 17.2 } })
    (is (= 17.2 (:height (mgcol/find-map-by-id db coll oid [:height]))))))

(deftest update-multiple-existing-fields-using-$set-modifier
  (let [db   (mg/get-db "altdb")
        coll "things"
        oid  (ObjectId.)]
    (mgcol/insert db coll { :_id oid :weight 10.0 :height 15.2 })
    (mgcol/update db coll { :_id oid } { $set { :weight 20.5 :height 25.6 } })
    (is (= { :_id oid :weight 20.5 :height 25.6 } (mgcol/find-map-by-id db coll oid [:weight :height])))))


(deftest update-and-set-multiple-fields-using-$set-modifier
  (let [db   (mg/get-db "altdb")
        coll "things"
        oid  (ObjectId.)]
    (mgcol/insert db coll { :_id oid :weight 10.0 })
    (mgcol/update db coll { :_id oid } {$set { :weight 20.5 :height 25.6 } })
    (is (= { :_id oid :weight 20.5 :height 25.6 } (mgcol/find-map-by-id db coll oid [:weight :height])))))


;;
;; $unset
;;

(deftest unset-a-single-existing-field-using-$unset-modifier
  (let [db   (mg/get-db "altdb")
        coll "docs"
        oid  (ObjectId.)]
    (mgcol/insert db coll { :_id oid :title "Document 1" :published true })
    (mgcol/update db coll { :_id oid } { $unset { :published 1 } })
    (is (= { :_id oid :title "Document 1" } (mgcol/find-map-by-id db coll oid)))))


(deftest unset-multiple-existing-fields-using-$unset-modifier
  (let [db   (mg/get-db "altdb")
        coll "docs"
        oid  (ObjectId.)]
    (mgcol/insert db coll { :_id oid :title "Document 1" :published true :featured true })
    (mgcol/update db coll { :_id oid } { $unset { :published 1 :featured true } })
    (is (= { :_id oid :title "Document 1" } (mgcol/find-map-by-id db coll oid)))))


(deftest unsetting-an-unexisting-field-using-$unset-modifier-is-not-considered-an-issue
  (let [db   (mg/get-db "altdb")
        coll "docs"
        oid  (ObjectId.)]
    (mgcol/insert db coll { :_id oid :title "Document 1" :published true })
    (is (mgres/ok? (mgcol/update db coll { :_id oid } { $unset { :published 1 :featured true } })))
    (is (= { :_id oid :title "Document 1" } (mgcol/find-map-by-id db coll oid)))))

;;
;; $setOnInsert
;;

(deftest setOnInsert-in-upsert-for-non-existing-document
  (let [db   (mg/get-db "altdb")
        coll "docs"
        now  456
        oid  (ObjectId.)]
    (mgcol/find-and-modify db coll {:_id oid} {$set {:lastseen now} $setOnInsert {:firstseen now}} :upsert true)
    (is (= { :_id oid :lastseen now :firstseen now} (mgcol/find-map-by-id db coll oid)))))

(deftest setOnInsert-in-upsert-for-existing-document
  (let [db     (mg/get-db "altdb")
        coll   "docs"
        before 123
        now    456
        oid    (ObjectId.)]
    (mgcol/insert db coll { :_id oid :firstseen before :lastseen before})
    (mgcol/find-and-modify db coll {:_id oid} {$set {:lastseen now} $setOnInsert {:firstseen now}} :upsert true)
    (is (= { :_id oid :lastseen now :firstseen before} (mgcol/find-map-by-id db coll oid)))))

;;
;; $push
;;

(deftest initialize-an-array-using-$push-modifier
  (let [db    (mg/get-db "altdb")
        coll  "docs"
        oid   (ObjectId.)
        title "$push modifier appends value to field"]
    (mgcol/insert db coll { :_id oid :title title })
    (mgcol/update db coll { :_id oid } { $push { :tags "modifiers" } })
    (is (= { :_id oid :title title :tags ["modifiers"] } (mgcol/find-map-by-id db coll oid)))))

(deftest add-value-to-an-existing-array-using-$push-modifier
  (let [db    (mg/get-db "altdb")
        coll  "docs"
        oid   (ObjectId.)
        title "$push modifier appends value to field"]
    (mgcol/insert db coll { :_id oid :title title :tags ["mongodb"] })
    (mgcol/update db coll { :_id oid } { $push { :tags "modifiers" } })
    (is (= { :_id oid :title title :tags ["mongodb" "modifiers"] } (mgcol/find-map-by-id db coll oid)))))


;; this is a common mistake, I leave it here to demonstrate it. You almost never
;; actually want to do this! What you really want is to use $pushAll instead of $push. MK.
(deftest add-array-value-to-an-existing-array-using-$push-modifier
  (let [db    (mg/get-db "altdb")
        coll  "docs"
        oid   (ObjectId.)
        title "$push modifier appends value to field"]
    (mgcol/insert db coll { :_id oid :title title :tags ["mongodb"] })
    (mgcol/update db coll { :_id oid } { $push { :tags ["modifiers" "operators"] } })
    (is (= { :_id oid :title title :tags ["mongodb" ["modifiers" "operators"]] } (mgcol/find-map-by-id db coll oid)))))


(deftest double-add-value-to-an-existing-array-using-$push-modifier
  (let [db    (mg/get-db "altdb")
        coll  "docs"
        oid   (ObjectId.)
        title "$push modifier appends value to field"]
    (mgcol/insert db coll { :_id oid :title title :tags ["mongodb"] })
    (mgcol/update db coll { :_id oid } { $push { :tags "modifiers" } })
    (mgcol/update db coll { :_id oid } { $push { :tags "modifiers" } })
    (is (= { :_id oid :title title :tags ["mongodb" "modifiers" "modifiers"] } (mgcol/find-map-by-id db coll oid)))))

;;
;; $pushAll
;;

(deftest initialize-an-array-using-$pushAll-modifier
  (let [db    (mg/get-db "altdb")
        coll  "docs"
        oid   (ObjectId.)
        title "$pushAll modifier appends multiple values to field"]
    (mgcol/insert db coll { :_id oid :title title })
    (mgcol/update db coll { :_id oid } { $pushAll { :tags ["mongodb" "docs"] } })
    (is (= { :_id oid :title title :tags ["mongodb" "docs"] } (mgcol/find-map-by-id db coll oid)))))

(deftest add-value-to-an-existing-array-using-$pushAll-modifier
  (let [db    (mg/get-db "altdb")
        coll  "docs"
        oid   (ObjectId.)
        title "$pushAll modifier appends multiple values to field"]
    (mgcol/insert db coll { :_id oid :title title :tags ["mongodb"] })
    (mgcol/update db coll { :_id oid } { $pushAll { :tags ["modifiers" "docs"] } })
    (is (= { :_id oid :title title :tags ["mongodb" "modifiers" "docs"] } (mgcol/find-map-by-id db coll oid)))))


(deftest double-add-value-to-an-existing-array-using-$pushAll-modifier
  (let [db    (mg/get-db "altdb")
        coll  "docs"
        oid   (ObjectId.)
        title "$pushAll modifier appends multiple values to field"]
    (mgcol/insert db coll { :_id oid :title title :tags ["mongodb" "docs"] })
    (mgcol/update db coll { :_id oid } { $pushAll { :tags ["modifiers" "docs"] } })
    (is (= { :_id oid :title title :tags ["mongodb" "docs" "modifiers" "docs"] } (mgcol/find-map-by-id db coll oid)))))

;;
;; $addToSet
;;

(deftest initialize-an-array-using-$addToSet-modifier
  (let [db    (mg/get-db "altdb")
        coll  "docs"
        oid   (ObjectId.)
        title "$addToSet modifier appends value to field unless it is already there"]
    (mgcol/insert db coll { :_id oid :title title })
    (mgcol/update db coll { :_id oid } { $addToSet { :tags "modifiers" } })
    (is (= { :_id oid :title title :tags ["modifiers"] } (mgcol/find-map-by-id db coll oid)))))

(deftest add-value-to-an-existing-array-using-$addToSet-modifier
  (let [db    (mg/get-db "altdb")
        coll  "docs"
        oid   (ObjectId.)
        title "$addToSet modifier appends value to field unless it is already there"]
    (mgcol/insert db coll { :_id oid :title title :tags ["mongodb"] })
    (mgcol/update db coll { :_id oid } { $addToSet { :tags "modifiers" } })
    (is (= { :_id oid :title title :tags ["mongodb" "modifiers"] } (mgcol/find-map-by-id db coll oid)))))


(deftest double-add-value-to-an-existing-array-using-$addToSet-modifier
  (let [db    (mg/get-db "altdb")
        coll  "docs"
        oid   (ObjectId.)
        title "$addToSet modifier appends value to field unless it is already there"]
    (mgcol/insert db coll { :_id oid :title title :tags ["mongodb"] })
    (mgcol/update db coll { :_id oid } { $addToSet { :tags "modifiers" } })
    (mgcol/update db coll { :_id oid } { $addToSet { :tags "modifiers" } })
    (is (= { :_id oid :title title :tags ["mongodb" "modifiers"] } (mgcol/find-map-by-id db coll oid)))))

;;
|
||||
;; $pop
|
||||
;;
|
||||
|
||||
(deftest pop-last-value-in-the-array-using-$pop-modifier
|
||||
(let [db (mg/get-db "altdb")
|
||||
coll "docs"
|
||||
oid (ObjectId.)
|
||||
title "$pop modifier removes last or first value in the array"]
|
||||
(mgcol/insert db coll { :_id oid :title title :tags ["products" "apple" "reviews"] })
|
||||
(mgcol/update db coll { :_id oid } { $pop { :tags 1 } })
|
||||
(is (= { :_id oid :title title :tags ["products" "apple"] } (mgcol/find-map-by-id db coll oid)))))
|
||||
|
||||
(deftest unshift-first-value-in-the-array-using-$pop-modifier
|
||||
(let [db (mg/get-db "altdb")
|
||||
coll "docs"
|
||||
oid (ObjectId.)
|
||||
title "$pop modifier removes last or first value in the array"]
|
||||
(mgcol/insert db coll { :_id oid :title title :tags ["products" "apple" "reviews"] })
|
||||
(mgcol/update db coll { :_id oid } { $pop { :tags -1 } })
|
||||
(is (= { :_id oid :title title :tags ["apple" "reviews"] } (mgcol/find-map-by-id db coll oid)))))
|
||||
|
||||
(deftest pop-last-values-from-multiple-arrays-using-$pop-modifier
|
||||
(let [db (mg/get-db "altdb")
|
||||
coll "docs"
|
||||
oid (ObjectId.)
|
||||
title "$pop modifier removes last or first value in the array"]
|
||||
(mgcol/insert db coll { :_id oid :title title :tags ["products" "apple" "reviews"] :categories ["apple" "reviews" "drafts"] })
|
||||
(mgcol/update db coll { :_id oid } { $pop { :tags 1 :categories 1 } })
|
||||
(is (= { :_id oid :title title :tags ["products" "apple"] :categories ["apple" "reviews"] } (mgcol/find-map-by-id db coll oid)))))
|
||||
|
||||
|
||||
;;
|
||||
;; $pull
|
||||
;;
|
||||
|
||||
(deftest remove-all-value-entries-from-array-using-$pull-modifier
|
||||
(let [db (mg/get-db "altdb")
|
||||
coll "docs"
|
||||
oid (ObjectId.)
|
||||
title "$pull modifier removes all value entries in the array"]
|
||||
(mgcol/insert db coll { :_id oid :title title :measurements [1.0 1.2 1.2 1.2 1.1 1.1 1.2 1.3 1.0] })
|
||||
(mgcol/update db coll { :_id oid } { $pull { :measurements 1.2 } })
|
||||
(is (= { :_id oid :title title :measurements [1.0 1.1 1.1 1.3 1.0] } (mgcol/find-map-by-id db coll oid)))))
|
||||
|
||||
(deftest remove-all-value-entries-from-array-using-$pull-modifier-based-on-a-condition
|
||||
(let [db (mg/get-db "altdb")
|
||||
coll "docs"
|
||||
oid (ObjectId.)
|
||||
title "$pull modifier removes all value entries in the array"]
|
||||
(mgcol/insert db coll { :_id oid :title title :measurements [1.0 1.2 1.2 1.2 1.1 1.1 1.2 1.3 1.0] })
|
||||
(mgcol/update db coll { :_id oid } { $pull { :measurements { $gte 1.2 } } })
|
||||
(is (= { :_id oid :title title :measurements [1.0 1.1 1.1 1.0] } (mgcol/find-map-by-id db coll oid)))))
|
||||
;;
|
||||
;; $pullAll
|
||||
;;
|
||||
|
||||
(deftest remove-all-value-entries-from-array-using-$pullAll-modifier
|
||||
(let [db (mg/get-db "altdb")
|
||||
coll "docs"
|
||||
oid (ObjectId.)
|
||||
title "$pullAll modifier removes entries of multiple values in the array"]
|
||||
(mgcol/insert db coll { :_id oid :title title :measurements [1.0 1.2 1.2 1.2 1.1 1.1 1.2 1.3 1.0] })
|
||||
(mgcol/update db coll { :_id oid } { $pullAll { :measurements [1.0 1.1 1.2] } })
|
||||
(is (= { :_id oid :title title :measurements [1.3] } (mgcol/find-map-by-id db coll oid)))))
|
||||
|
||||
|
||||
;;
|
||||
;; $rename
|
||||
;;
|
||||
|
||||
(deftest rename-a-single-field-using-$rename-modifier
|
||||
(let [db (mg/get-db "altdb")
|
||||
coll "docs"
|
||||
oid (ObjectId.)
|
||||
title "$rename renames fields"
|
||||
v [1.0 1.2 1.2 1.2 1.1 1.1 1.2 1.3 1.0]]
|
||||
(mgcol/insert db coll { :_id oid :title title :measurements v })
|
||||
(mgcol/update db coll { :_id oid } { $rename { :measurements "results" } })
|
||||
(is (= { :_id oid :title title :results v } (mgcol/find-map-by-id db coll oid)))))
|
||||
|
||||
|
||||
;;
|
||||
;; find-and-modify
|
||||
;;
|
||||
|
||||
(deftest find-and-modify-a-single-document
|
||||
(let [db (mg/get-db "altdb")
|
||||
coll "docs"
|
||||
oid (ObjectId.)
|
||||
doc {:_id oid :name "Sophie Bangs" :level 42}
|
||||
conditions {:name "Sophie Bangs"}
|
||||
update {$inc {:level 1}}]
|
||||
(mgcol/insert db coll doc)
|
||||
(let [res (mgcol/find-and-modify db coll conditions update :return-new true)]
|
||||
(is (= (select-keys res [:name :level]) {:name "Sophie Bangs" :level 43})))))
|
||||
|
||||
|
||||
(deftest find-and-modify-remove-a-document
|
||||
(let [db (mg/get-db "altdb")
|
||||
coll "docs"
|
||||
oid (ObjectId.)
|
||||
doc {:_id oid :name "Sophie Bangs" :level 42}
|
||||
conditions {:name "Sophie Bangs"}]
|
||||
(mgcol/insert db coll doc)
|
||||
(let [res (mgcol/find-and-modify db coll conditions {} :remove true)]
|
||||
(is (= (select-keys res [:name :level]) {:name "Sophie Bangs" :level 42}))
|
||||
(is (empty? (mgcol/find-maps db coll conditions))))))
|
||||
|
||||
|
||||
(deftest find-and-modify-upsert-a-document
|
||||
(testing "case 1"
|
||||
(let [db (mg/get-db "altdb")
|
||||
coll "docs"
|
||||
oid (ObjectId.)
|
||||
doc {:_id oid :name "Sophie Bangs" :level 42}]
|
||||
(let [res (mgcol/find-and-modify db coll doc doc :upsert true)]
|
||||
(is (empty? res))
|
||||
(is (select-keys (mgcol/find-map-by-id db coll oid) [:name :level]) (dissoc doc :_id)))))
|
||||
(testing "case 2"
|
||||
(let [db (mg/get-db "altdb")
|
||||
coll "docs"
|
||||
query {:name "Sophie Bangs"}
|
||||
doc (merge query {:level 42})]
|
||||
(let [res (mgcol/find-and-modify db coll query doc :upsert true :return-new true)]
|
||||
(is (:_id res))
|
||||
(is (select-keys (mgcol/find-map-by-id db coll (:_id res)) [:name :level]) doc)))))
|
||||
|
||||
|
||||
(deftest find-and-modify-after-sort
|
||||
(let [db (mg/get-db "altdb")
|
||||
coll "docs"
|
||||
oid (ObjectId.)
|
||||
oid2 (ObjectId.)
|
||||
doc {:name "Sophie Bangs"}
|
||||
doc1 (assoc doc :_id oid :level 42)
|
||||
doc2 (assoc doc :_id oid2 :level 0)]
|
||||
(mgcol/insert-batch db coll [doc1 doc2])
|
||||
(let [res (mgcol/find-and-modify db coll doc {$inc {:level 1}} :sort {:level -1})]
|
||||
(is (= (select-keys res [:name :level]) {:name "Sophie Bangs" :level 42})))))
|
||||
132
test/monger/test/multi/collection_test.clj
Normal file
@@ -0,0 +1,132 @@
(set! *warn-on-reflection* true)

(ns monger.test.multi.collection-test
  (:import [com.mongodb WriteResult WriteConcern DBCursor DBObject MapReduceOutput MapReduceCommand MapReduceCommand$OutputType]
           org.bson.types.ObjectId
           java.util.Date)
  (:require [monger.core :as mg]
            [monger.multi.collection :as mc]
            [monger.result :as mgres]
            [monger.test.helper :as helper])
  (:use clojure.test
        monger.operators
        monger.test.fixtures))

(helper/connect!)

(defn drop-altdb
  [f]
  (mg/drop-db "altdb")
  (f))

(use-fixtures :each drop-altdb)

(deftest get-collection-size
  (let [db         (mg/get-db "altdb")
        collection "things"]
    (is (= 0 (mc/count db collection)))
    (mc/insert-batch db collection [{:language "Clojure" :name "langohr"}
                                    {:language "Clojure" :name "monger"}
                                    {:language "Clojure" :name "incanter"}
                                    {:language "Scala" :name "akka"}])
    (is (= 4 (mc/count db collection)))
    (is (mc/any? db collection))
    (is (= 3 (mc/count db collection {:language "Clojure"})))
    (is (mc/any? db collection {:language "Clojure"}))
    (is (= 1 (mc/count db collection {:language "Scala"})))
    (is (mc/any? db collection {:language "Scala"}))
    (is (= 0 (mc/count db collection {:language "Python"})))
    (is (not (mc/any? db collection {:language "Python"})))))

(deftest remove-all-documents-from-collection
  (let [db         (mg/get-db "altdb")
        collection "libraries"]
    (mc/insert-batch db collection [{:language "Clojure" :name "monger"}
                                    {:language "Clojure" :name "langohr"}
                                    {:language "Clojure" :name "incanter"}
                                    {:language "Scala" :name "akka"}])
    (is (= 4 (mc/count db collection)))
    (mc/remove db collection)
    (is (= 0 (mc/count db collection)))))

(deftest remove-some-documents-from-collection
  (let [db         (mg/get-db "altdb")
        collection "libraries"]
    (mc/insert-batch db collection [{:language "Clojure" :name "monger"}
                                    {:language "Clojure" :name "langohr"}
                                    {:language "Clojure" :name "incanter"}
                                    {:language "Scala" :name "akka"}])
    (is (= 4 (mc/count db collection)))
    (mc/remove db collection {:language "Clojure"})
    (is (= 1 (mc/count db collection)))))

(deftest remove-a-single-document-from-collection
  (let [db         (mg/get-db "altdb")
        collection "libraries"
        oid        (ObjectId.)]
    (mc/insert-batch db collection [{:language "Clojure" :name "monger" :_id oid}])
    (mc/remove-by-id db collection oid)
    (is (= 0 (mc/count db collection)))
    (is (nil? (mc/find-by-id db collection oid)))))

(deftest checking-for-collection-existence-when-it-does-not-exist
  (let [db         (mg/get-db "altdb")
        collection "widgets"]
    (mc/drop db collection)
    (is (false? (mc/exists? db collection)))))

(deftest checking-for-collection-existence-when-it-does-exist
  (let [db         (mg/get-db "altdb")
        collection "widgets"]
    (mc/drop db collection)
    (mc/insert-batch db collection [{:name "widget1"}
                                    {:name "widget2"}])
    (is (mc/exists? db collection))
    (mc/drop db collection)
    (is (false? (mc/exists? db collection)))
    (mc/create db "widgets" {:capped true :size 100000 :max 10})
    (is (mc/exists? db collection))
    (mc/rename db collection "gadgets")
    (is (not (mc/exists? db collection)))
    (is (mc/exists? db "gadgets"))
    (mc/drop db "gadgets")))

(deftest test-any-on-empty-collection
  (let [db         (mg/get-db "altdb")
        collection "things"]
    (is (not (mc/any? db collection)))))

(deftest test-any-on-non-empty-collection
  (let [db         (mg/get-db "altdb")
        collection "things"
        _          (mc/insert db collection {:language "Clojure" :name "langohr"})]
    (is (mc/any? db "things"))
    (is (mc/any? db "things" {:language "Clojure"}))))

(deftest test-empty-on-empty-collection
  (let [db         (mg/get-db "altdb")
        collection "things"]
    (is (mc/empty? db collection))
    (is (mc/empty? db collection))))

(deftest test-empty-on-non-empty-collection
  (let [db         (mg/get-db "altdb")
        collection "things"
        _          (mc/insert db collection {:language "Clojure" :name "langohr"})]
    (is (not (mc/empty? db "things")))))

(deftest test-distinct-values
  (let [db         (mg/get-db "altdb")
        collection "widgets"
        batch      [{:state "CA" :quantity 1 :price 199.00}
                    {:state "NY" :quantity 2 :price 199.00}
                    {:state "NY" :quantity 1 :price 299.00}
                    {:state "IL" :quantity 2 :price 11.50}
                    {:state "CA" :quantity 2 :price 2.95}
                    {:state "IL" :quantity 3 :price 5.50}]]
    (mc/insert-batch db collection batch)
    (is (= ["CA" "IL" "NY"] (sort (mc/distinct db collection :state {}))))
    (is (= ["CA" "NY"] (sort (mc/distinct db collection :state {:price {$gt 100.00}}))))))


(run-tests)
147
test/monger/test/multi/find_test.clj
Normal file
@@ -0,0 +1,147 @@
(ns monger.test.multi.find-test
  (:import [com.mongodb WriteResult WriteConcern DBCursor DBObject DBRef]
           org.bson.types.ObjectId
           java.util.Date)
  (:require [monger.core :as mg]
            [monger.util :as mu]
            [monger.multi.collection :as mgcol]
            [monger.test.helper :as helper]
            [monger.conversion :as mgcnv])
  (:use clojure.test
        monger.operators
        monger.conversion
        monger.test.fixtures))

(helper/connect!)

(defn drop-altdb
  [f]
  (mg/drop-db "altdb")
  (f))

(use-fixtures :each drop-altdb)

;;
;; find
;;

(deftest find-full-document-when-collection-is-empty
  (let [db         (mg/get-db "altdb")
        collection "docs"
        cursor     (mgcol/find db collection)]
    (is (empty? (iterator-seq cursor)))))

(deftest find-document-seq-when-collection-is-empty
  (let [db         (mg/get-db "altdb")
        collection "docs"]
    (is (empty? (mgcol/find-seq db collection)))))

(deftest find-multiple-documents-when-collection-is-empty
  (let [db         (mg/get-db "altdb")
        collection "libraries"]
    (is (empty? (mgcol/find db collection { :language "Scala" })))))

(deftest find-multiple-maps-when-collection-is-empty
  (let [db         (mg/get-db "altdb")
        collection "libraries"]
    (is (empty? (mgcol/find-maps db collection { :language "Scala" })))))

(deftest find-multiple-documents-by-regex
  (let [db         (mg/get-db "altdb")
        collection "libraries"]
    (mgcol/insert-batch db collection [{ :language "Clojure", :name "monger" }
                                       { :language "Java", :name "nhibernate" }
                                       { :language "JavaScript", :name "sprout-core" }])
    (is (= 2 (monger.core/count (mgcol/find db collection { :language #"Java*" }))))))

(deftest find-multiple-documents
  (let [db         (mg/get-db "altdb")
        collection "libraries"]
    (mgcol/insert-batch db collection [{ :language "Clojure", :name "monger" }
                                       { :language "Clojure", :name "langohr" }
                                       { :language "Clojure", :name "incanter" }
                                       { :language "Scala", :name "akka" }])
    (is (= 1 (monger.core/count (mgcol/find db collection { :language "Scala" }))))
    (is (= 3 (.count (mgcol/find db collection { :language "Clojure" }))))
    (is (empty? (mgcol/find db collection { :language "Java" })))))


(deftest find-document-specify-fields
  (let [db         (mg/get-db "altdb")
        collection "libraries"
        _          (mgcol/insert db collection { :language "Clojure", :name "monger" })
        result     (mgcol/find db collection { :language "Clojure" } [:language])]
    (is (= (seq [:_id :language]) (keys (mgcnv/from-db-object (.next result) true))))))

(deftest find-and-iterate-over-multiple-documents-the-hard-way
  (let [db         (mg/get-db "altdb")
        collection "libraries"]
    (mgcol/insert-batch db collection [{ :language "Clojure", :name "monger" }
                                       { :language "Clojure", :name "langohr" }
                                       { :language "Clojure", :name "incanter" }
                                       { :language "Scala", :name "akka" }])
    (doseq [doc (take 3 (map (fn [dbo]
                               (mgcnv/from-db-object dbo true))
                             (mgcol/find-seq db collection { :language "Clojure" })))]
      (is (= "Clojure" (:language doc))))))

(deftest find-and-iterate-over-multiple-documents
  (let [db         (mg/get-db "altdb")
        collection "libraries"]
    (mgcol/insert-batch db collection [{ :language "Clojure", :name "monger" }
                                       { :language "Clojure", :name "langohr" }
                                       { :language "Clojure", :name "incanter" }
                                       { :language "Scala", :name "akka" }])
    (doseq [doc (take 3 (mgcol/find-maps db collection { :language "Clojure" }))]
      (is (= "Clojure" (:language doc))))))


(deftest find-multiple-maps
  (let [db         (mg/get-db "altdb")
        collection "libraries"]
    (mgcol/insert-batch db collection [{ :language "Clojure", :name "monger" }
                                       { :language "Clojure", :name "langohr" }
                                       { :language "Clojure", :name "incanter" }
                                       { :language "Scala", :name "akka" }])
    (is (= 1 (count (mgcol/find-maps db collection { :language "Scala" }))))
    (is (= 3 (count (mgcol/find-maps db collection { :language "Clojure" }))))
    (is (empty? (mgcol/find-maps db collection { :language "Java" })))
    (is (empty? (mgcol/find-maps db collection { :language "Java" } [:language :name])))))


(deftest find-multiple-partial-documents
  (let [db         (mg/get-db "altdb")
        collection "libraries"]
    (mgcol/insert-batch db collection [{ :language "Clojure", :name "monger" }
                                       { :language "Clojure", :name "langohr" }
                                       { :language "Clojure", :name "incanter" }
                                       { :language "Scala", :name "akka" }])
    (let [scala-libs   (mgcol/find db collection { :language "Scala" } [:name])
          clojure-libs (mgcol/find db collection { :language "Clojure" } [:language])]
      (is (= 1 (.count scala-libs)))
      (is (= 3 (.count clojure-libs)))
      (doseq [i clojure-libs]
        (let [doc (mgcnv/from-db-object i true)]
          (is (= (:language doc) "Clojure"))))
      (is (empty? (mgcol/find db collection { :language "Erlang" } [:name]))))))

(deftest finds-one-as-map
  (let [db         (mg/get-db "altdb")
        collection "libraries"]
    (mgcol/insert-batch db collection [{ :language "Clojure", :name "monger" }
                                       { :language "Clojure", :name "langohr" }])
    (let [res (mgcol/find-one-as-map db collection { :name "langohr" })]
      (is (map? res))
      (is (= "langohr" (:name res)))
      (is (= "Clojure" (:language res))))
    (is (= 2 (count (mgcol/find-one-as-map db collection { :name "langohr" } [:name]))))
    (is (= "langohr" (get (mgcol/find-one-as-map db collection { :name "langohr" } [:name] false) "name")))))

(deftest find-and-modify
  (let [db         (mg/get-db "altdb")
        collection "libraries"]
    (mgcol/insert-batch db collection [{ :language "Clojure", :name "monger" }
                                       { :language "Clojure", :name "langohr" }])))
57
test/monger/test/multi/indexing_test.clj
Normal file
@@ -0,0 +1,57 @@
(ns monger.test.multi.indexing-test
  (:import org.bson.types.ObjectId
           java.util.Date)
  (:require [monger.core :as mg]
            [monger.multi.collection :as mc]
            [monger.test.helper :as helper]
            monger.joda-time)
  (:use clojure.test
        monger.test.fixtures
        [clj-time.core :only [now secs ago from-now]]))

(helper/connect!)

(defn drop-altdb
  [f]
  (mg/drop-db "altdb")
  (f))

(use-fixtures :each drop-altdb)

(deftest ^{:indexing true} test-creating-and-dropping-indexes
  (let [db         (mg/get-db "altdb")
        collection "libraries"]
    (mc/drop-indexes db collection)
    (mc/create-index db collection { "language" 1 })
    (is (= "language_1"
           (:name (second (mc/indexes-on db collection)))))
    (mc/drop-index db collection "language_1")
    (mc/create-index db collection ["language"])
    (mc/drop-index db collection "language_1")
    (is (nil? (second (mc/indexes-on db collection))))
    (mc/ensure-index db collection (array-map "language" 1) {:unique true})
    (is (= "language_1"
           (:name (second (mc/indexes-on db collection)))))
    (mc/ensure-index db collection (array-map "language" 1))
    (mc/ensure-index db collection (array-map "language" 1) { :unique true })
    (mc/drop-indexes db collection)))

(deftest ^{:indexing true :edge-features true :time-consuming true} test-ttl-collections
  (let [db    (mg/get-db "altdb")
        coll  "recent_events"
        ttl   30
        sleep 120]
    (mc/remove db coll)
    (mc/ensure-index db coll (array-map :created-at 1) {:expireAfterSeconds ttl})
    (dotimes [i 100]
      (mc/insert db coll {:type "signup" :created-at (-> i secs ago) :i i}))
    (dotimes [i 100]
      (mc/insert db coll {:type "signup" :created-at (-> i secs from-now) :i i}))
    (is (= 200 (mc/count db coll {:type "signup"})))
    ;; sleep for 120 seconds. MongoDB 2.1.2 seems to run TTLMonitor once per minute, according to
    ;; the log. MK.
    (println (format "Now sleeping for %d seconds to test TTL collections!" sleep))
    (Thread/sleep (* sleep 1000))
    (println (format "Documents in the TTL collection: %d" (mc/count db coll {:type "signup"})))
    (is (< (mc/count db coll {:type "signup"}) 100))
    (mc/remove db coll)))
189
test/monger/test/multi/inserting_test.clj
Normal file
@@ -0,0 +1,189 @@
(ns monger.test.multi.inserting-test
  (:import [com.mongodb WriteResult WriteConcern DBCursor DBObject DBRef]
           org.bson.types.ObjectId
           java.util.Date)
  (:require [monger.core :as mg]
            [monger.util :as mu]
            [monger.multi.collection :as mc]
            [monger.test.helper :as helper])
  (:use clojure.test
        monger.operators
        monger.conversion
        monger.test.fixtures))

(helper/connect!)

(defn drop-altdb
  [f]
  (mg/drop-db "altdb")
  (f))

(use-fixtures :each drop-altdb)

;;
;; insert
;;

(deftest insert-a-basic-document-without-id-and-with-default-write-concern
  (let [db         (mg/get-db "altdb")
        collection "people"
        doc        {:name "Joe" :age 30}]
    (is (monger.result/ok? (mc/insert db "people" doc)))
    (is (= 1 (mc/count db collection)))))

(deftest insert-a-basic-document-with-explicitly-passed-database-without-id-and-with-default-write-concern
  (let [db         (mg/get-db "altdb")
        collection "people"
        doc        {:name "Joe" :age 30}]
    (dotimes [n 5]
      (is (monger.result/ok? (mc/insert db "people" doc WriteConcern/SAFE))))
    (is (= 5 (mc/count db collection)))))

(deftest insert-a-basic-document-without-id-and-with-explicit-write-concern
  (let [db         (mg/get-db "altdb")
        collection "people"
        doc        {:name "Joe" :age 30}]
    (is (monger.result/ok? (mc/insert db "people" doc WriteConcern/SAFE)))
    (is (= 1 (mc/count db collection)))))

(deftest insert-a-basic-db-object-without-id-and-with-default-write-concern
  (let [db         (mg/get-db "altdb")
        collection "people"
        doc        (to-db-object {:name "Joe" :age 30})]
    (is (nil? (.get ^DBObject doc "_id")))
    (mc/insert db "people" doc)
    (is (not (nil? (monger.util/get-id doc))))))

(deftest insert-a-map-with-id-and-with-default-write-concern
  (let [db         (mg/get-db "altdb")
        collection "people"
        id         (ObjectId.)
        doc        {:name "Joe" :age 30 "_id" id}
        result     (mc/insert db "people" doc)]
    (is (= id (monger.util/get-id doc)))))

(deftest insert-a-document-with-clojure-ratio-in-it
  (let [db         (mg/get-db "altdb")
        collection "widgets"
        id         (ObjectId.)
        doc        {:ratio 11/2 "_id" id}
        result     (mc/insert db "widgets" doc)]
    (is (= 5.5 (:ratio (mc/find-map-by-id db collection id))))))

(deftest insert-a-document-with-clojure-keyword-in-it
  (let [db         (mg/get-db "altdb")
        collection "widgets"
        id         (ObjectId.)
        doc        {:keyword :kwd "_id" id}
        result     (mc/insert db "widgets" doc)]
    (is (= (name :kwd) (:keyword (mc/find-map-by-id db collection id))))))

(deftest insert-a-document-with-clojure-keyword-in-a-set-in-it
  (let [db         (mg/get-db "altdb")
        collection "widgets"
        id         (ObjectId.)
        doc        {:keyword1 {:keyword2 #{:kw1 :kw2}} "_id" id}
        result     (mc/insert db "widgets" doc)]
    (is (= (sort ["kw1" "kw2"])
           (sort (get-in (mc/find-map-by-id db collection id) [:keyword1 :keyword2]))))))

(defrecord Metrics [rps eps])

(deftest insert-a-document-with-clojure-record-in-it
  (let [db         (mg/get-db "altdb")
        collection "widgets"
        id         (ObjectId.)
        doc        {:record (Metrics. 10 20) "_id" id}
        result     (mc/insert db "widgets" doc)]
    (is (= {:rps 10 :eps 20} (:record (mc/find-map-by-id db collection id))))))

(deftest test-insert-a-document-with-dbref
  (let [db (mg/get-db "altdb")]
    (mc/remove db "widgets")
    (mc/remove db "owners")
    (let [coll1 "widgets"
          coll2 "owners"
          oid   (ObjectId.)
          joe   (mc/insert db "owners" {:name "Joe" :_id oid})
          dbref (DBRef. (mg/current-db) coll2 oid)]
      (mc/insert db coll1 {:type "pentagon" :owner dbref})
      (let [fetched (mc/find-one-as-map db coll1 {:type "pentagon"})
            fo      (:owner fetched)]
        (is (= {:_id oid :name "Joe"} (from-db-object @fo true)))))))


;;
;; insert-and-return
;;

(deftest insert-and-return-a-basic-document-without-id-and-with-default-write-concern
  (let [db         (mg/get-db "altdb")
        collection "people"
        doc        {:name "Joe" :age 30}
        result     (mc/insert-and-return db :people doc)]
    (is (= (:name doc)
           (:name result)))
    (is (= (:age doc)
           (:age result)))
    (is (:_id result))
    (is (= 1 (mc/count db collection)))))

(deftest insert-and-return-a-basic-document-without-id-but-with-a-write-concern
  (let [db         (mg/get-db "altdb")
        collection "people"
        doc        {:name "Joe" :age 30 :ratio 3/4}
        result     (mc/insert-and-return db "people" doc WriteConcern/FSYNC_SAFE)]
    (is (= (:name doc)
           (:name result)))
    (is (= (:age doc)
           (:age result)))
    (is (= (:ratio doc)
           (:ratio result)))
    (is (:_id result))
    (is (= 1 (mc/count db collection)))))

(deftest insert-and-return-with-a-provided-id
  (let [db         (mg/get-db "altdb")
        collection "people"
        oid        (ObjectId.)
        doc        {:name "Joe" :age 30 :_id oid}
        result     (mc/insert-and-return db :people doc)]
    (is (= (:_id result) (:_id doc) oid))
    (is (= 1 (mc/count db collection)))))


;;
;; insert-batch
;;

(deftest insert-a-batch-of-basic-documents-without-ids-and-with-default-write-concern
  (let [db         (mg/get-db "altdb")
        collection "people"
        docs       [{:name "Joe" :age 30} {:name "Paul" :age 27}]]
    (is (monger.result/ok? (mc/insert-batch db "people" docs)))
    (is (= 2 (mc/count db collection)))))

(deftest insert-a-batch-of-basic-documents-without-ids-and-with-explicit-write-concern
  (let [db         (mg/get-db "altdb")
        collection "people"
        docs       [{:name "Joe" :age 30} {:name "Paul" :age 27}]]
    (is (monger.result/ok? (mc/insert-batch db "people" docs WriteConcern/NORMAL)))
    (is (= 2 (mc/count db collection)))))

(deftest insert-a-batch-of-basic-documents-with-explicit-database-without-ids-and-with-explicit-write-concern
  (let [db         (mg/get-db "altdb")
        collection "people"
        docs       [{:name "Joe" :age 30} {:name "Paul" :age 27}]]
    (dotimes [n 44]
      (is (monger.result/ok? (mc/insert-batch db "people" docs WriteConcern/NORMAL))))
    (is (= 88 (mc/count db collection)))))

(deftest insert-a-batch-of-basic-documents-from-a-lazy-sequence
  (let [db         (mg/get-db "altdb")
        collection "people"
        numbers    (range 0 1000)]
    (is (monger.result/ok? (mc/insert-batch db "people" (map (fn [^long l]
                                                               {:n l})
                                                             numbers))))
    (is (= (count numbers) (mc/count db collection)))))
183
test/monger/test/multi/updating_test.clj
Normal file
@@ -0,0 +1,183 @@
(set! *warn-on-reflection* true)

(ns monger.test.multi.updating-test
  (:import [com.mongodb WriteResult WriteConcern DBCursor DBObject]
           org.bson.types.ObjectId
           java.util.Date)
  (:require [monger.core :as mg]
            [monger core util]
            [monger.multi.collection :as mc]
            [monger.result :as mr]
            [monger.test.helper :as helper])
  (:use clojure.test
        monger.operators
        monger.test.fixtures
        [monger.conversion :only [to-db-object]]))

(helper/connect!)

(defn drop-altdb
  [f]
  (mg/drop-db "altdb")
  (f))

(use-fixtures :each drop-altdb)

;;
;; update, save
;;

(deftest ^{:updating true} update-document-by-id-without-upsert
  (let [db           (mg/get-db "altdb")
        collection   "libraries"
        doc-id       (monger.util/random-uuid)
        date         (Date.)
        doc          { :created-at date, :data-store "MongoDB", :language "Clojure", :_id doc-id }
        modified-doc { :created-at date, :data-store "MongoDB", :language "Erlang", :_id doc-id }]
    (mc/insert db collection doc)
    (is (= (doc (mc/find-by-id db collection doc-id))))
    (mc/update db collection { :_id doc-id } { :language "Erlang" })
    (is (= (modified-doc (mc/find-by-id db collection doc-id))))))

(deftest ^{:updating true} update-document-by-id-without-upsert-using-update-by-id
  (let [db           (mg/get-db "altdb")
        collection   "libraries"
        doc-id       (monger.util/random-uuid)
        date         (Date.)
        doc          { :created-at date, :data-store "MongoDB", :language "Clojure", :_id doc-id }
        modified-doc { :created-at date, :data-store "MongoDB", :language "Erlang", :_id doc-id }]
    (mc/insert db collection doc)
    (is (= (doc (mc/find-by-id db collection doc-id))))
    (mc/update-by-id db collection doc-id { :language "Erlang" })
    (is (= (modified-doc (mc/find-by-id db collection doc-id))))))

(deftest ^{:updating true} update-nested-document-fields-without-upsert-using-update-by-id
  (let [db           (mg/get-db "altdb")
        collection   "libraries"
        doc-id       (ObjectId.)
        date         (Date.)
        doc          { :created-at date :data-store "MongoDB" :language { :primary "Clojure" } :_id doc-id }
        modified-doc { :created-at date :data-store "MongoDB" :language { :primary "Erlang" } :_id doc-id }]
    (mc/insert db collection doc)
    (is (= (doc (mc/find-by-id db collection doc-id))))
    (mc/update-by-id db collection doc-id { $set { "language.primary" "Erlang" }})
    (is (= (modified-doc (mc/find-by-id db collection doc-id))))))


(deftest ^{:updating true} update-multiple-documents
  (let [db         (mg/get-db "altdb")
        collection "libraries"]
    (mc/insert db collection { :language "Clojure", :name "monger" })
    (mc/insert db collection { :language "Clojure", :name "langohr" })
    (mc/insert db collection { :language "Clojure", :name "incanter" })
    (mc/insert db collection { :language "Scala", :name "akka" })
    (is (= 3 (mc/count db collection { :language "Clojure" })))
    (is (= 1 (mc/count db collection { :language "Scala" })))
    (is (= 0 (mc/count db collection { :language "Python" })))
    (mc/update db collection { :language "Clojure" } { $set { :language "Python" } } :multi true)
    (is (= 0 (mc/count db collection { :language "Clojure" })))
    (is (= 1 (mc/count db collection { :language "Scala" })))
    (is (= 3 (mc/count db collection { :language "Python" })))))


(deftest ^{:updating true} save-a-new-document
  (let [db         (mg/get-db "altdb")
        collection "people"
        document   {:name "Joe" :age 30}]
    (is (mr/ok? (mc/save db "people" document)))
    (is (= 1 (mc/count db collection)))))

(deftest ^{:updating true} save-and-return-a-new-document
  (let [db         (mg/get-db "altdb")
        collection "people"
        document   {:name "Joe" :age 30}
        returned   (mc/save-and-return db "people" document)]
    (is (:_id returned))
    (is (= document (dissoc returned :_id)))
    (is (= 1 (mc/count db collection)))))


(deftest ^{:updating true} save-a-new-basic-db-object
  (let [db         (mg/get-db "altdb")
        collection "people"
        doc        (to-db-object {:name "Joe" :age 30})]
    (is (nil? (monger.util/get-id doc)))
    (mc/save db "people" doc WriteConcern/SAFE)
    (is (not (nil? (monger.util/get-id doc))))))


(deftest ^{:updating true} update-an-existing-document-using-save
  (let [db         (mg/get-db "altdb")
        collection "people"
        doc-id     "people-1"
        document   { :_id doc-id, :name "Joe", :age 30 }]
    (is (mr/ok? (mc/insert db "people" document)))
    (is (= 1 (mc/count db collection)))
    (mc/save db collection { :_id doc-id, :name "Alan", :age 40 })
    (is (= 1 (mc/count db collection { :name "Alan", :age 40 })))))

(deftest ^{:updating true} update-an-existing-document-using-save-and-return
  (let [db         (mg/get-db "altdb")
        collection "people"
        document   (mc/insert-and-return db "people" {:name "Joe" :age 30})
        doc-id     (:_id document)
        updated    (mc/save-and-return db collection {:_id doc-id :name "Alan" :age 40})]
    (is (= {:_id doc-id :name "Alan" :age 40} updated))
    (is (= 1 (mc/count db collection)))
    (is (= 1 (mc/count db collection {:name "Alan" :age 40})))))


(deftest ^{:updating true} set-an-attribute-on-existing-document-using-update
  (let [db         (mg/get-db "altdb")
        collection "people"
        doc-id     (monger.util/object-id)
        document   { :_id doc-id, :name "Joe", :age 30 }]
|
||||
(is (mr/ok? (mc/insert db "people" document)))
|
||||
(is (= 1 (mc/count db collection)))
|
||||
(is (= 0 (mc/count db collection { :has_kids true })))
|
||||
(mc/update db collection { :_id doc-id } { $set { :has_kids true } })
|
||||
(is (= 1 (mc/count db collection { :has_kids true })))))
|
||||
|
||||
|
||||
(deftest ^{:updating true} increment-multiple-fields-using-exists-operator-and-update
|
||||
(let [db (mg/get-db "altdb")
|
||||
collection "matches"
|
||||
doc-id (monger.util/object-id)
|
||||
document { :_id doc-id :abc 0 :def 10 }]
|
||||
(mc/remove db collection)
|
||||
(is (mr/ok? (mc/insert db collection document)))
|
||||
(is (= 1 (mc/count db collection {:abc {$exists true} :def {$exists true}})))
|
||||
(mc/update db collection {:abc {$exists true} :def {$exists true}} {$inc {:abc 1 :def 0}})
|
||||
(is (= 1 (mc/count db collection { :abc 1 })))))
|
||||
|
||||
|
||||
|
||||
(deftest ^{:updating true} upsert-a-document-using-update
|
||||
(let [db (mg/get-db "altdb")
|
||||
collection "libraries"
|
||||
doc-id (monger.util/random-uuid)
|
||||
date (Date.)
|
||||
doc { :created-at date, :data-store "MongoDB", :language "Clojure", :_id doc-id }
|
||||
modified-doc { :created-at date, :data-store "MongoDB", :language "Erlang", :_id doc-id }]
|
||||
(is (not (mr/updated-existing? (mc/update db collection { :language "Clojure" } doc :upsert true))))
|
||||
(is (= 1 (mc/count db collection)))
|
||||
(is (mr/updated-existing? (mc/update db collection { :language "Clojure" } modified-doc :multi false :upsert true)))
|
||||
(is (= 1 (mc/count db collection)))
|
||||
(is (= (modified-doc (mc/find-by-id db collection doc-id))))
|
||||
(mc/remove db collection)))
|
||||
|
||||
(deftest ^{:updating true} upsert-a-document-using-upsert
|
||||
(let [db (mg/get-db "altdb")
|
||||
collection "libraries"
|
||||
doc-id (monger.util/random-uuid)
|
||||
date (Date.)
|
||||
doc {:created-at date :data-store "MongoDB" :language "Clojure" :_id doc-id}
|
||||
modified-doc {:created-at date :data-store "MongoDB" :language "Erlang" :_id doc-id}]
|
||||
(mc/remove db collection)
|
||||
(is (not (mr/updated-existing? (mc/upsert db collection {:language "Clojure"} doc))))
|
||||
(is (= 1 (mc/count db collection)))
|
||||
(is (mr/updated-existing? (mc/upsert db collection {:language "Clojure"} modified-doc :multi false)))
|
||||
(is (= 1 (mc/count db collection)))
|
||||
(is (= (modified-doc (mc/find-by-id db collection doc-id))))
|
||||
(mc/remove db collection)))
|
||||
|
|
@@ -1,146 +1,111 @@
(set! *warn-on-reflection* true)

(ns monger.test.query-operators-test
  (:require [monger.core :as mg]
            [monger.collection :as mc]
  (:import [com.mongodb WriteResult WriteConcern DBCursor DBObject MapReduceOutput MapReduceCommand MapReduceCommand$OutputType]
           org.bson.types.ObjectId
           java.util.Date)
  (:require [monger core util]
            [clojure stacktrace]
            [monger.collection :as mgcol]
            [monger.result :as mgres]
            [monger.conversion :as mgcnv]
            [monger.js :as js]
            [clojure.test :refer :all]
            [clojure.set :refer [difference]]
            [monger.operators :refer :all])
  (:import [com.mongodb QueryOperators]))
            [monger.test.helper :as helper])
  (:use [clojure.test]
        [monger.operators]
        [monger.test.fixtures]))

;; (use-fixtures :each purge-people purge-docs purge-things purge-libraries)
(monger.core/connect!)
(monger.core/set-db! (monger.core/get-db "monger-test"))

(deftest every-query-operator-is-defined
  (let [driver-query-operators (->> (.getDeclaredFields QueryOperators) (map #(.get % nil)) set)
        monger-query-operators (->> (ns-publics 'monger.operators) (map (comp name first)) set)
        ;; $within is deprecated and replaced by $geoWithin since v2.4.
        ;; $uniqueDocs is deprecated since v2.6.
        deprecated-query-operators #{"$within" "$uniqueDocs"}
        ;; Query modifier operators that are deprecated in the mongo shell since v3.2
        deprecated-meta-operators #{"$comment" "$explain" "$hint" "$maxScan"
                                    "$maxTimeMS" "$max" "$min" "$orderby"
                                    "$returnKey" "$showDiskLoc" "$snapshot" "$query"}
        undefined-non-deprecated-operators (difference driver-query-operators
                                                       deprecated-query-operators
                                                       deprecated-meta-operators
                                                       monger-query-operators)]
    (is (= #{} undefined-non-deprecated-operators))))
(use-fixtures :each purge-people purge-docs purge-things purge-libraries)

(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")]
  (defn purge-collections
    [f]
    (mc/remove db "people")
    (mc/remove db "libraries")
    (f)
    (mc/remove db "people")
    (mc/remove db "libraries"))
;;
;; $gt, $gte, $lt, lte
;;

  (use-fixtures :each purge-collections)
(deftest find-with-conditional-operators-comparison
  (let [collection "libraries"]
    (mgcol/insert-batch collection [{:language "Clojure" :name "monger" :users 1}
                                    {:language "Clojure" :name "langohr" :users 5}
                                    {:language "Clojure" :name "incanter" :users 15}
                                    {:language "Scala" :name "akka" :users 150}])
    (are [a b] (= a (.count (mgcol/find collection b)))
      2 {:users {$gt 10}}
      3 {:users {$gte 5}}
      2 {:users {$lt 10}}
      2 {:users {$lte 5}}
      1 {:users {$gt 10 $lt 150}})))

;;
;; $gt, $gte, $lt, lte
;;
;;
;; $ne
;;

  (deftest find-with-conditional-operators-comparison
    (let [collection "libraries"]
      (mc/insert-batch db collection [{:language "Clojure" :name "monger" :users 1}
                                      {:language "Clojure" :name "langohr" :users 5}
                                      {:language "Clojure" :name "incanter" :users 15}
                                      {:language "Scala" :name "akka" :users 150}])
      (are [a b] (= a (.count (mc/find db collection b)))
        2 {:users {$gt 10}}
        3 {:users {$gte 5}}
        2 {:users {$lt 10}}
        2 {:users {$lte 5}}
        1 {:users {$gt 10 $lt 150}})))

;;
;; $eq
;;

  (deftest find-with-eq-operator
    (let [collection "libraries"]
      (mc/insert-batch db collection [{:language "Ruby" :name "mongoid" :users 1 :displayName nil}
                                      {:language "Clojure" :name "langohr" :users 5}
                                      {:language "Clojure" :name "incanter" :users 15}
                                      {:language "Scala" :name "akka" :users 150}])
      (is (= 2 (.count (mc/find db collection {:language {$eq "Clojure"}}))))))

;;
;; $ne
;;

  (deftest find-with-ne-operator
    (let [collection "libraries"]
      (mc/insert-batch db collection [{:language "Ruby" :name "mongoid" :users 1}
                                      {:language "Clojure" :name "langohr" :users 5}
                                      {:language "Clojure" :name "incanter" :users 15}
                                      {:language "Scala" :name "akka" :users 150}])
      (is (= 2 (.count (mc/find db collection {:language {$ne "Clojure"}}))))))
(deftest find-with-and-or-operators
  (let [collection "libraries"]
    (mgcol/insert-batch collection [{:language "Ruby" :name "mongoid" :users 1}
                                    {:language "Clojure" :name "langohr" :users 5}
                                    {:language "Clojure" :name "incanter" :users 15}
                                    {:language "Scala" :name "akka" :users 150}])
    (is (= 2 (.count (mgcol/find collection {$ne {:language "Clojure"}}))))))

;;
;; $and, $or, $nor
;;
;;
;; $and, $or, $nor
;;

  (deftest find-with-and-or-operators
    (let [collection "libraries"]
      (mc/insert-batch db collection [{:language "Ruby" :name "mongoid" :users 1}
                                      {:language "Clojure" :name "langohr" :users 5}
                                      {:language "Clojure" :name "incanter" :users 15}
                                      {:language "Scala" :name "akka" :users 150}])
      (is (= 1 (.count (mc/find db collection {$and [{:language "Clojure"}
                                                     {:users {$gt 10}}]}))))
      (is (= 3 (.count (mc/find db collection {$or [{:language "Clojure"}
                                                    {:users {$gt 10}}]}))))
      (is (= 1 (.count (mc/find db collection {$nor [{:language "Clojure"}
                                                     {:users {$gt 10}}]}))))))
(deftest find-with-and-or-operators
  (let [collection "libraries"]
    (mgcol/insert-batch collection [{:language "Ruby" :name "mongoid" :users 1}
                                    {:language "Clojure" :name "langohr" :users 5}
                                    {:language "Clojure" :name "incanter" :users 15}
                                    {:language "Scala" :name "akka" :users 150}])
    (is (= 1 (.count (mgcol/find collection {$and [{:language "Clojure"}
                                                   {:users {$gt 10}}]}))))
    (is (= 3 (.count (mgcol/find collection {$or [{:language "Clojure"}
                                                  {:users {$gt 10}}]}))))
    (is (= 1 (.count (mgcol/find collection {$nor [{:language "Clojure"}
                                                   {:users {$gt 10}}]}))))))

;;
;; $all, $in, $nin
;;
;;
;; $all, $in, $nin
;;

  (deftest find-on-embedded-arrays
    (let [collection "libraries"]
      (mc/insert-batch db collection [{:language "Clojure" :tags [ "functional" ]}
                                      {:language "Scala" :tags [ "functional" "object-oriented" ]}
                                      {:language "Ruby" :tags [ "object-oriented" "dynamic" ]}])
(deftest find-on-embedded-arrays
  (let [collection "libraries"]
    (mgcol/insert-batch collection [{:language "Clojure" :tags [ "functional" ]}
                                    {:language "Scala" :tags [ "functional" "object-oriented" ]}
                                    {:language "Ruby" :tags [ "object-oriented" "dynamic" ]}])

      (is (= "Scala" (:language (first (mc/find-maps db collection {:tags {$all [ "functional" "object-oriented" ]}})))))
      (is (= 3 (.count (mc/find-maps db collection {:tags {$in [ "functional" "object-oriented" ]}}))))
      (is (= 2 (.count (mc/find-maps db collection {:language {$in [ "Scala" "Ruby" ]}}))))
      (is (= 1 (.count (mc/find-maps db collection {:tags {$nin [ "dynamic" "object-oriented" ]}}))))
      (is (= 3 (.count (mc/find-maps db collection {:language {$nin [ "C#" ]}}))))))
    (is (= "Scala" (:language (first (mgcol/find-maps collection {:tags {$all [ "functional" "object-oriented" ]}})))))
    (is (= 3 (.count (mgcol/find-maps collection {:tags {$in [ "functional" "object-oriented" ]}}))))
    (is (= 2 (.count (mgcol/find-maps collection {:language {$in [ "Scala" "Ruby" ]}}))))
    (is (= 1 (.count (mgcol/find-maps collection {:tags {$nin [ "dynamic" "object-oriented" ]}}))))
    (is (= 3 (.count (mgcol/find-maps collection {:language {$nin [ "C#" ]}}))))))


  (deftest find-with-conditional-operators-on-embedded-documents
    (let [collection "people"]
      (mc/insert-batch db collection [{:name "Bob" :comments [{:text "Nice!" :rating 1}
                                                              {:text "Love it" :rating 4}
                                                              {:text "What?" :rating -5}]}
                                      {:name "Alice" :comments [{:text "Yeah" :rating 2}
                                                                {:text "Doh" :rating 1}
                                                                {:text "Agreed" :rating 3}]}])
      (are [a b] (= a (.count (mc/find db collection b)))
        1 {:comments {$elemMatch {:text "Nice!" :rating {$gte 1}}}}
        2 {"comments.rating" 1}
        1 {"comments.rating" {$gt 3}})))
(deftest find-with-conditional-operators-on-embedded-documents
  (let [collection "people"]
    (mgcol/insert-batch collection [{:name "Bob" :comments [{:text "Nice!" :rating 1}
                                                            {:text "Love it" :rating 4}
                                                            {:text "What?" :rating -5}]}
                                    {:name "Alice" :comments [{:text "Yeah" :rating 2}
                                                              {:text "Doh" :rating 1}
                                                              {:text "Agreed" :rating 3}]}])
    (are [a b] (= a (.count (mgcol/find collection b)))
      1 {:comments {$elemMatch {:text "Nice!" :rating {$gte 1}}}}
      2 {"comments.rating" 1}
      1 {"comments.rating" {$gt 3}})))

  (deftest find-with-regex-operator
    (let [collection "libraries"]
      (mc/insert-batch db collection [{:language "Ruby" :name "Mongoid" :users 1}
                                      {:language "Clojure" :name "Langohr" :users 5}
                                      {:language "Clojure" :name "Incanter" :users 15}
                                      {:language "Scala" :name "Akka" :users 150}])
      (are [query results] (is (= results (.count (mc/find db collection query))))
        {:language {$regex "Clo.*"}} 2
        {:language {$regex "clo.*" $options "i"}} 2
        {:name {$regex "aK.*" $options "i"}} 1
        {:language {$regex ".*by"}} 1
        {:language {$regex ".*ala.*"}} 1)))

  (deftest find-with-js-expression
    (let [collection "people"]
      (mc/insert-batch db collection [{:name "Bob" :placeOfBirth "New York" :address {:city "New York"}}
                                      {:name "Alice" :placeOfBirth "New York" :address {:city "Los Angeles"}}])
      (is (= 1 (.count (mc/find db collection {$where "this.placeOfBirth === this.address.city"})))))))
(deftest find-with-regex-operator
  (let [collection "libraries"]
    (mgcol/insert-batch collection [{:language "Ruby" :name "Mongoid" :users 1}
                                    {:language "Clojure" :name "Langohr" :users 5}
                                    {:language "Clojure" :name "Incanter" :users 15}
                                    {:language "Scala" :name "Akka" :users 150}])
    (are [query results] (is (= results (.count (mgcol/find collection query))))
      {:language {$regex "Clo.*"}} 2
      {:language {$regex "clo.*" $options "i"}} 2
      {:name {$regex "aK.*" $options "i"}} 1
      {:language {$regex ".*by"}} 1
      {:language {$regex ".*ala.*"}} 1)))
@@ -1,325 +1,314 @@
(set! *warn-on-reflection* true)

(ns monger.test.querying-test
  (:refer-clojure :exclude [select find sort])
  (:import [com.mongodb WriteResult WriteConcern DBObject ReadPreference]
  (:import [com.mongodb WriteResult WriteConcern DBCursor DBObject ReadPreference]
           org.bson.types.ObjectId
           java.util.Date)
  (:require [monger.core :as mg]
            [monger.collection :as mc]
            monger.joda-time
  (:require [monger core util]
            [monger.collection :as mgcol]
            [monger.result :as mgres]
            [clojure.test :refer :all]
            [monger.conversion :refer :all]
            [monger.query :refer :all]
            [monger.operators :refer :all]
            [clj-time.core :refer [date-time]]))
            [monger.test.helper :as helper])
  (:use clojure.test
        monger.test.fixtures
        [monger conversion query operators]
        [clj-time.core :only [date-time]]))

(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")]
(helper/connect!)

  (defn purge-collections
    [f]
    (mc/remove db "docs")
    (mc/remove db "things")
    (mc/remove db "locations")
    (mc/remove db "querying_docs")
    (f)
    (mc/remove db "docs")
    (mc/remove db "things")
    (mc/remove db "locations")
    (mc/remove db "querying_docs"))

  (use-fixtures :each purge-collections)

;;
;; monger.collection/* finders ("low-level API")
;;

;; by ObjectId

  (deftest query-full-document-by-object-id
    (let [coll "querying_docs"
          oid (ObjectId.)
          doc { :_id oid :title "Introducing Monger" }]
      (mc/insert db coll doc)
      (is (= doc (mc/find-map-by-id db coll oid)))
      (is (= doc (mc/find-one-as-map db coll { :_id oid })))))
(use-fixtures :each purge-docs purge-things purge-locations)

;; exact match over string field
;;
;; monger.collection/* finders ("low-level API")
;;

  (deftest query-full-document-using-exact-matching-over-string-field
    (let [coll "querying_docs"
          doc { :title "monger" :language "Clojure" :_id (ObjectId.) }]
      (mc/insert db coll doc)
      (is (= [doc] (mc/find-maps db coll { :title "monger" })))
      (is (= doc (from-db-object (first (mc/find db coll { :title "monger" })) true)))))
;; by ObjectId

(deftest query-full-document-by-object-id
  (let [coll "docs"
        oid (ObjectId.)
        doc { :_id oid :title "Introducing Monger" }]
    (mgcol/insert coll doc)
    (is (= doc (mgcol/find-map-by-id coll oid)))
    (is (= doc (mgcol/find-one-as-map coll { :_id oid })))))

;; exact match over string field with limit
;; exact match over string field

  (deftest query-full-document-using-exact-matching-over-string-with-field-with-limit
    (let [coll "querying_docs"
          doc1 { :title "monger" :language "Clojure" :_id (ObjectId.) }
          doc2 { :title "langohr" :language "Clojure" :_id (ObjectId.) }
          doc3 { :title "netty" :language "Java" :_id (ObjectId.) }
          _ (mc/insert-batch db coll [doc1 doc2 doc3])
          result (with-collection db coll
                   (find { :title "monger" })
                   (fields [:title, :language, :_id])
                   (skip 0)
                   (limit 1))]
      (is (= 1 (count result)))
      (is (= [doc1] result))))
(deftest query-full-document-using-exact-matching-over-string-field
  (let [coll "docs"
        doc { :title "monger" :language "Clojure" :_id (ObjectId.) }]
    (mgcol/insert coll doc)
    (is (= [doc] (mgcol/find-maps coll { :title "monger" })))
    (is (= doc (from-db-object (first (mgcol/find coll { :title "monger" })) true)))))

  (deftest query-full-document-using-exact-matching-over-string-field-with-limit-and-offset
    (let [coll "querying_docs"
          doc1 { :title "lucene" :language "Java" :_id (ObjectId.) }
          doc2 { :title "joda-time" :language "Java" :_id (ObjectId.) }
          doc3 { :title "netty" :language "Java" :_id (ObjectId.) }
          _ (mc/insert-batch db coll [doc1 doc2 doc3])
          result (with-collection db coll
                   (find { :language "Java" })
                   (skip 1)
                   (limit 2)
                   (sort { :title 1 }))]
      (is (= 2 (count result)))
      (is (= [doc1 doc3] result))))
;; exact match over string field with limit

  (deftest query-with-sorting-on-multiple-fields
    (let [coll "querying_docs"
          doc1 { :a 1 :b 2 :c 3 :text "Whatever" :_id (ObjectId.) }
          doc2 { :a 1 :b 1 :c 4 :text "Blah " :_id (ObjectId.) }
          doc3 { :a 10 :b 3 :c 1 :text "Abc" :_id (ObjectId.) }
          doc4 { :a 10 :b 3 :c 3 :text "Abc" :_id (ObjectId.) }
          _ (mc/insert-batch db coll [doc1 doc2 doc3 doc4])
          result1 (with-collection db coll
                    (find {})
                    (limit 2)
                    (fields [:a :b :c :text])
                    (sort (sorted-map :a 1 :b 1 :text -1)))
          result2 (with-collection db coll
                    (find {})
                    (limit 2)
                    (fields [:a :b :c :text])
                    (sort (array-map :c 1 :text -1)))
          result3 (with-collection db coll
                    (find {})
                    (limit 2)
                    (fields [:a :b :c :text])
                    (sort (array-map :c -1 :text 1)))]
      (is (= [doc2 doc1] result1))
      (is (= [doc3 doc1] result2))
      (is (= [doc2 doc4] result3))))
(deftest query-full-document-using-exact-matching-over-string-with-field-with-limit
  (let [coll "docs"
        doc1 { :title "monger" :language "Clojure" :_id (ObjectId.) }
        doc2 { :title "langohr" :language "Clojure" :_id (ObjectId.) }
        doc3 { :title "netty" :language "Java" :_id (ObjectId.) }
        _ (mgcol/insert-batch coll [doc1 doc2 doc3])
        result (with-collection coll
                 (find { :title "monger" })
                 (fields [:title, :language, :_id])
                 (skip 0)
                 (limit 1))]
    (is (= 1 (count result)))
    (is (= [doc1] result))))

;; < ($lt), <= ($lte), > ($gt), >= ($gte)
(deftest query-full-document-using-exact-matching-over-string-field-with-limit-and-offset
  (let [coll "docs"
        doc1 { :title "lucene" :language "Java" :_id (ObjectId.) }
        doc2 { :title "joda-time" :language "Java" :_id (ObjectId.) }
        doc3 { :title "netty" :language "Java" :_id (ObjectId.) }
        _ (mgcol/insert-batch coll [doc1 doc2 doc3])
        result (with-collection coll
                 (find { :language "Java" })
                 (skip 1)
                 (limit 2)
                 (sort { :title 1 }))]
    (is (= 2 (count result)))
    (is (= [doc1 doc3] result))))

  (deftest query-using-dsl-and-$lt-operator-with-integers
    (let [coll "querying_docs"
          doc1 { :language "Clojure" :_id (ObjectId.) :inception_year 2006 }
          doc2 { :language "Java" :_id (ObjectId.) :inception_year 1992 }
          doc3 { :language "Scala" :_id (ObjectId.) :inception_year 2003 }
          _ (mc/insert-batch db coll [doc1 doc2])
          lt-result (with-collection db coll
                      (find { :inception_year { $lt 2000 } })
                      (limit 2))]
      (is (= [doc2] (vec lt-result)))))
(deftest query-with-sorting-on-multiple-fields
  (let [coll "docs"
        doc1 { :a 1 :b 2 :c 3 :text "Whatever" :_id (ObjectId.) }
        doc2 { :a 1 :b 1 :c 4 :text "Blah " :_id (ObjectId.) }
        doc3 { :a 10 :b 3 :c 1 :text "Abc" :_id (ObjectId.) }
        doc4 { :a 10 :b 3 :c 3 :text "Abc" :_id (ObjectId.) }
        _ (mgcol/insert-batch coll [doc1 doc2 doc3 doc4])
        result1 (with-collection coll
                  (find {})
                  (limit 2)
                  (fields [:a :b :c :text])
                  (sort (sorted-map :a 1 :b 1 :text -1)))
        result2 (with-collection coll
                  (find {})
                  (limit 2)
                  (fields [:a :b :c :text])
                  (sort (array-map :c 1 :text -1)))
        result3 (with-collection coll
                  (find {})
                  (limit 2)
                  (fields [:a :b :c :text])
                  (sort (array-map :c -1 :text 1)))]
    (is (= [doc2 doc1] result1))
    (is (= [doc3 doc1] result2))
    (is (= [doc2 doc4] result3))))

  (deftest query-using-dsl-and-$lt-operator-with-dates
    (let [coll "querying_docs"
          ;; these rely on monger.joda-time being loaded. MK.
          doc1 { :language "Clojure" :_id (ObjectId.) :inception_year (date-time 2006 1 1) }
          doc2 { :language "Java" :_id (ObjectId.) :inception_year (date-time 1992 1 2) }
          doc3 { :language "Scala" :_id (ObjectId.) :inception_year (date-time 2003 3 3) }
          _ (mc/insert-batch db coll [doc1 doc2])
          lt-result (with-collection db coll
                      (find { :inception_year { $lt (date-time 2000 1 2) } })
                      (limit 2))]
      (is (= (map :_id [doc2])
             (map :_id (vec lt-result))))))
;; < ($lt), <= ($lte), > ($gt), >= ($gte)

  (deftest query-using-both-$lte-and-$gte-operators-with-dates
    (let [coll "querying_docs"
          ;; these rely on monger.joda-time being loaded. MK.
          doc1 { :language "Clojure" :_id (ObjectId.) :inception_year (date-time 2006 1 1) }
          doc2 { :language "Java" :_id (ObjectId.) :inception_year (date-time 1992 1 2) }
          doc3 { :language "Scala" :_id (ObjectId.) :inception_year (date-time 2003 3 3) }
          _ (mc/insert-batch db coll [doc1 doc2 doc3])
          lt-result (with-collection db coll
                      (find { :inception_year { $gt (date-time 2000 1 2) $lte (date-time 2007 2 2) } })
                      (sort { :inception_year 1 }))]
      (is (= (map :_id [doc3 doc1])
             (map :_id (vec lt-result))))))
(deftest query-using-dsl-and-$lt-operator-with-integers
  (let [coll "docs"
        doc1 { :language "Clojure" :_id (ObjectId.) :inception_year 2006 }
        doc2 { :language "Java" :_id (ObjectId.) :inception_year 1992 }
        doc3 { :language "Scala" :_id (ObjectId.) :inception_year 2003 }
        _ (mgcol/insert-batch coll [doc1 doc2])
        lt-result (with-collection "docs"
                    (find { :inception_year { $lt 2000 } })
                    (limit 2))]
    (is (= [doc2] (vec lt-result)))))

  (deftest query-using-$gt-$lt-$gte-$lte-operators-as-strings
    (let [coll "querying_docs"
          doc1 { :language "Clojure" :_id (ObjectId.) :inception_year 2006 }
          doc2 { :language "Java" :_id (ObjectId.) :inception_year 1992 }
          doc3 { :language "Scala" :_id (ObjectId.) :inception_year 2003 }
          _ (mc/insert-batch db coll [doc1 doc2 doc3])]
      (are [doc, result]
           (= doc, result)
        (doc2 (with-collection db coll
                (find { :inception_year { "$lt" 2000 } })))
        (doc2 (with-collection db coll
                (find { :inception_year { "$lte" 1992 } })))
        (doc1 (with-collection db coll
                (find { :inception_year { "$gt" 2002 } })
                (limit 1)
                (sort { :inception_year -1 })))
        (doc1 (with-collection db coll
                (find { :inception_year { "$gte" 2006 } }))))))
(deftest query-using-dsl-and-$lt-operator-with-dates
  (let [coll "docs"
        ;; these rely on monger.joda-time being loaded. MK.
        doc1 { :language "Clojure" :_id (ObjectId.) :inception_year (date-time 2006 1 1) }
        doc2 { :language "Java" :_id (ObjectId.) :inception_year (date-time 1992 1 2) }
        doc3 { :language "Scala" :_id (ObjectId.) :inception_year (date-time 2003 3 3) }
        _ (mgcol/insert-batch coll [doc1 doc2])
        lt-result (with-collection "docs"
                    (find { :inception_year { $lt (date-time 2000 1 2) } })
                    (limit 2))]
    (is (= (map :_id [doc2])
           (map :_id (vec lt-result))))))

(deftest query-using-both-$lte-and-$gte-operators-with-dates
  (let [coll "docs"
        ;; these rely on monger.joda-time being loaded. MK.
        doc1 { :language "Clojure" :_id (ObjectId.) :inception_year (date-time 2006 1 1) }
        doc2 { :language "Java" :_id (ObjectId.) :inception_year (date-time 1992 1 2) }
        doc3 { :language "Scala" :_id (ObjectId.) :inception_year (date-time 2003 3 3) }
        _ (mgcol/insert-batch coll [doc1 doc2 doc3])
        lt-result (with-collection "docs"
                    (find { :inception_year { $gt (date-time 2000 1 2) $lte (date-time 2007 2 2) } })
                    (sort { :inception_year 1 }))]
    (is (= (map :_id [doc3 doc1])
           (map :_id (vec lt-result))))))

  (deftest query-using-$gt-$lt-$gte-$lte-operators-using-dsl-composition
    (let [coll "querying_docs"
          doc1 { :language "Clojure" :_id (ObjectId.) :inception_year 2006 }
          doc2 { :language "Java" :_id (ObjectId.) :inception_year 1992 }
          doc3 { :language "Scala" :_id (ObjectId.) :inception_year 2003 }
          srt (-> {}
                  (limit 1)
                  (sort { :inception_year -1 }))
          _ (mc/insert-batch db coll [doc1 doc2 doc3])]
      (is (= [doc1] (with-collection db coll
                      (find { :inception_year { "$gt" 2002 } })
                      (merge srt))))))
(deftest query-using-$gt-$lt-$gte-$lte-operators-as-strings
  (let [coll "docs"
        doc1 { :language "Clojure" :_id (ObjectId.) :inception_year 2006 }
        doc2 { :language "Java" :_id (ObjectId.) :inception_year 1992 }
        doc3 { :language "Scala" :_id (ObjectId.) :inception_year 2003 }
        _ (mgcol/insert-batch coll [doc1 doc2 doc3])]
    (are [doc, result]
         (= doc, result)
      (doc2 (with-collection coll
              (find { :inception_year { "$lt" 2000 } })))
      (doc2 (with-collection coll
              (find { :inception_year { "$lte" 1992 } })))
      (doc1 (with-collection coll
              (find { :inception_year { "$gt" 2002 } })
              (limit 1)
              (sort { :inception_year -1 })))
      (doc1 (with-collection coll
              (find { :inception_year { "$gte" 2006 } }))))))

;; $all

  (deftest query-with-using-$all
    (let [coll "querying_docs"
          doc1 { :_id (ObjectId.) :title "Clojure" :tags ["functional" "homoiconic" "syntax-oriented" "dsls" "concurrency features" "jvm"] }
          doc2 { :_id (ObjectId.) :title "Java" :tags ["object-oriented" "jvm"] }
          doc3 { :_id (ObjectId.) :title "Scala" :tags ["functional" "object-oriented" "dsls" "concurrency features" "jvm"] }
          _ (mc/insert-batch db coll [doc1 doc2 doc3])
          result1 (with-collection db coll
                    (find { :tags { "$all" ["functional" "jvm" "homoiconic"] } }))
          result2 (with-collection db coll
                    (find { :tags { "$all" ["functional" "native" "homoiconic"] } }))
          result3 (with-collection db coll
                    (find { :tags { "$all" ["functional" "jvm" "dsls"] } })
                    (sort { :title 1 }))]
      (is (= [doc1] result1))
      (is (empty? result2))
      (is (= 2 (count result3)))
      (is (= doc1 (first result3)))))
(deftest query-using-$gt-$lt-$gte-$lte-operators-using-dsl-composition
  (let [coll "docs"
        doc1 { :language "Clojure" :_id (ObjectId.) :inception_year 2006 }
        doc2 { :language "Java" :_id (ObjectId.) :inception_year 1992 }
        doc3 { :language "Scala" :_id (ObjectId.) :inception_year 2003 }
        srt (-> {}
                (limit 1)
                (sort { :inception_year -1 }))
        _ (mgcol/insert-batch coll [doc1 doc2 doc3])]
    (is (= [doc1] (with-collection coll
                    (find { :inception_year { "$gt" 2002 } })
                    (merge srt))))))

;; $exists
;; $all

  (deftest query-with-find-one-as-map-using-$exists
    (let [coll "querying_docs"
          doc1 { :_id (ObjectId.) :published-by "Jill The Blogger" :draft false :title "X announces another Y" }
          doc2 { :_id (ObjectId.) :draft true :title "Z announces a Y competitor" }
          _ (mc/insert-batch db coll [doc1 doc2])
          result1 (mc/find-one-as-map db coll { :published-by { "$exists" true } })
          result2 (mc/find-one-as-map db coll { :published-by { "$exists" false } })]
      (is (= doc1 result1))
      (is (= doc2 result2))))

;; $mod

  (deftest query-with-find-one-as-map-using-$mod
    (let [coll "querying_docs"
          doc1 { :_id (ObjectId.) :counter 25 }
          doc2 { :_id (ObjectId.) :counter 32 }
          doc3 { :_id (ObjectId.) :counter 63 }
          _ (mc/insert-batch db coll [doc1 doc2 doc3])
          result1 (mc/find-one-as-map db coll { :counter { "$mod" [10, 5] } })
          result2 (mc/find-one-as-map db coll { :counter { "$mod" [10, 2] } })
          result3 (mc/find-one-as-map db coll { :counter { "$mod" [11, 1] } })]
      (is (= doc1 result1))
      (is (= doc2 result2))
      (is (empty? result3))))
(deftest query-with-using-$all
  (let [coll "docs"
        doc1 { :_id (ObjectId.) :title "Clojure" :tags ["functional" "homoiconic" "syntax-oriented" "dsls" "concurrency features" "jvm"] }
        doc2 { :_id (ObjectId.) :title "Java" :tags ["object-oriented" "jvm"] }
        doc3 { :_id (ObjectId.) :title "Scala" :tags ["functional" "object-oriented" "dsls" "concurrency features" "jvm"] }
        _ (mgcol/insert-batch coll [doc1 doc2 doc3])
        result1 (with-collection coll
                  (find { :tags { "$all" ["functional" "jvm" "homoiconic"] } }))
        result2 (with-collection coll
                  (find { :tags { "$all" ["functional" "native" "homoiconic"] } }))
        result3 (with-collection coll
                  (find { :tags { "$all" ["functional" "jvm" "dsls"] } })
                  (sort { :title 1 }))]
    (is (= [doc1] result1))
    (is (empty? result2))
    (is (= 2 (count result3)))
    (is (= doc1 (first result3)))))

;; $ne
;; $exists

  (deftest query-with-find-one-as-map-using-$ne
    (let [coll "querying_docs"
          doc1 { :_id (ObjectId.) :counter 25 }
          doc2 { :_id (ObjectId.) :counter 32 }
          _ (mc/insert-batch db coll [doc1 doc2])
          result1 (mc/find-one-as-map db coll { :counter { "$ne" 25 } })
          result2 (mc/find-one-as-map db coll { :counter { "$ne" 32 } })]
      (is (= doc2 result1))
      (is (= doc1 result2))))
(deftest query-with-find-one-as-map-using-$exists
  (let [coll "docs"
        doc1 { :_id (ObjectId.) :published-by "Jill The Blogger" :draft false :title "X announces another Y" }
        doc2 { :_id (ObjectId.) :draft true :title "Z announces a Y competitor" }
        _ (mgcol/insert-batch coll [doc1 doc2])
        result1 (mgcol/find-one-as-map coll { :published-by { "$exists" true } })
        result2 (mgcol/find-one-as-map coll { :published-by { "$exists" false } })]
    (is (= doc1 result1))
    (is (= doc2 result2))))

;;
;; monger.query DSL features
;;
;; $mod

;; pagination
  (deftest query-using-pagination-dsl
    (let [coll "querying_docs"
          doc1 { :_id (ObjectId.) :title "Clojure" :tags ["functional" "homoiconic" "syntax-oriented" "dsls" "concurrency features" "jvm"] }
|
||||
doc2 { :_id (ObjectId.) :title "Java" :tags ["object-oriented" "jvm"] }
|
||||
doc3 { :_id (ObjectId.) :title "Scala" :tags ["functional" "object-oriented" "dsls" "concurrency features" "jvm"] }
|
||||
doc4 { :_id (ObjectId.) :title "Ruby" :tags ["dynamic" "object-oriented" "dsls" "jvm"] }
|
||||
doc5 { :_id (ObjectId.) :title "Groovy" :tags ["dynamic" "object-oriented" "dsls" "jvm"] }
|
||||
doc6 { :_id (ObjectId.) :title "OCaml" :tags ["functional" "static" "dsls"] }
|
||||
doc7 { :_id (ObjectId.) :title "Haskell" :tags ["functional" "static" "dsls" "concurrency features"] }
|
||||
- (mc/insert-batch db coll [doc1 doc2 doc3 doc4 doc5 doc6 doc7])
|
||||
result1 (with-collection db coll
|
||||
(find {})
|
||||
(paginate :page 1 :per-page 3)
|
||||
(sort { :title 1 })
|
||||
(read-preference (ReadPreference/primary))
|
||||
(options com.mongodb.Bytes/QUERYOPTION_NOTIMEOUT))
|
||||
result2 (with-collection db coll
|
||||
(find {})
|
||||
(paginate :page 2 :per-page 3)
|
||||
(sort { :title 1 }))
|
||||
result3 (with-collection db coll
|
||||
(find {})
|
||||
(paginate :page 3 :per-page 3)
|
||||
(sort { :title 1 }))
|
||||
result4 (with-collection db coll
|
||||
(find {})
|
||||
(paginate :page 10 :per-page 3)
|
||||
(sort { :title 1 }))]
|
||||
(is (= [doc1 doc5 doc7] result1))
|
||||
(is (= [doc2 doc6 doc4] result2))
|
||||
(is (= [doc3] result3))
|
||||
(is (empty? result4))))
|
||||
(deftest query-with-find-one-as-map-using-$mod
|
||||
(let [coll "docs"
|
||||
doc1 { :_id (ObjectId.) :counter 25 }
|
||||
doc2 { :_id (ObjectId.) :counter 32 }
|
||||
doc3 { :_id (ObjectId.) :counter 63 }
|
||||
_ (mgcol/insert-batch coll [doc1 doc2 doc3])
|
||||
result1 (mgcol/find-one-as-map coll { :counter { "$mod" [10, 5] } })
|
||||
result2 (mgcol/find-one-as-map coll { :counter { "$mod" [10, 2] } })
|
||||
result3 (mgcol/find-one-as-map coll { :counter { "$mod" [11, 1] } })]
|
||||
(is (= doc1 result1))
|
||||
(is (= doc2 result2))
|
||||
(is (empty? result3))))
|
||||
|
||||
|
||||
(deftest combined-querying-dsl-example1
|
||||
(let [coll "querying_docs"
|
||||
ma-doc { :_id (ObjectId.) :name "Massachusetts" :iso "MA" :population 6547629 :joined_in 1788 :capital "Boston" }
|
||||
de-doc { :_id (ObjectId.) :name "Delaware" :iso "DE" :population 897934 :joined_in 1787 :capital "Dover" }
|
||||
ny-doc { :_id (ObjectId.) :name "New York" :iso "NY" :population 19378102 :joined_in 1788 :capital "Albany" }
|
||||
ca-doc { :_id (ObjectId.) :name "California" :iso "CA" :population 37253956 :joined_in 1850 :capital "Sacramento" }
|
||||
tx-doc { :_id (ObjectId.) :name "Texas" :iso "TX" :population 25145561 :joined_in 1845 :capital "Austin" }
|
||||
top3 (partial-query (limit 3))
|
||||
by-population-desc (partial-query (sort { :population -1 }))
|
||||
_ (mc/insert-batch db coll [ma-doc de-doc ny-doc ca-doc tx-doc])
|
||||
result (with-collection db coll
|
||||
(find {})
|
||||
(merge top3)
|
||||
(merge by-population-desc))]
|
||||
(is (= result [ca-doc tx-doc ny-doc]))))
|
||||
;; $ne
|
||||
|
||||
(deftest combined-querying-dsl-example2
|
||||
(let [coll "querying_docs"
|
||||
ma-doc { :_id (ObjectId.) :name "Massachusetts" :iso "MA" :population 6547629 :joined_in 1788 :capital "Boston" }
|
||||
de-doc { :_id (ObjectId.) :name "Delaware" :iso "DE" :population 897934 :joined_in 1787 :capital "Dover" }
|
||||
ny-doc { :_id (ObjectId.) :name "New York" :iso "NY" :population 19378102 :joined_in 1788 :capital "Albany" }
|
||||
ca-doc { :_id (ObjectId.) :name "California" :iso "CA" :population 37253956 :joined_in 1850 :capital "Sacramento" }
|
||||
tx-doc { :_id (ObjectId.) :name "Texas" :iso "TX" :population 25145561 :joined_in 1845 :capital "Austin" }
|
||||
top3 (partial-query (limit 3))
|
||||
by-population-desc (partial-query (sort { :population -1 }))
|
||||
_ (mc/insert-batch db coll [ma-doc de-doc ny-doc ca-doc tx-doc])
|
||||
result (with-collection db coll
|
||||
(find {})
|
||||
(merge top3)
|
||||
(merge by-population-desc)
|
||||
(keywordize-fields false))]
|
||||
;; documents have fields as strings,
|
||||
;; not keywords
|
||||
(is (= (map #(% "name") result)
|
||||
(map #(% :name) [ca-doc tx-doc ny-doc]))))))
|
||||
(deftest query-with-find-one-as-map-using-$ne
|
||||
(let [coll "docs"
|
||||
doc1 { :_id (ObjectId.) :counter 25 }
|
||||
doc2 { :_id (ObjectId.) :counter 32 }
|
||||
_ (mgcol/insert-batch coll [doc1 doc2])
|
||||
result1 (mgcol/find-one-as-map coll { :counter { "$ne" 25 } })
|
||||
result2 (mgcol/find-one-as-map coll { :counter { "$ne" 32 } })]
|
||||
(is (= doc2 result1))
|
||||
(is (= doc1 result2))))
|
||||
|
||||
;;
|
||||
;; monger.query DSL features
|
||||
;;
|
||||
|
||||
;; pagination
|
||||
(deftest query-using-pagination-dsl
|
||||
(let [coll "docs"
|
||||
doc1 { :_id (ObjectId.) :title "Clojure" :tags ["functional" "homoiconic" "syntax-oriented" "dsls" "concurrency features" "jvm"] }
|
||||
doc2 { :_id (ObjectId.) :title "Java" :tags ["object-oriented" "jvm"] }
|
||||
doc3 { :_id (ObjectId.) :title "Scala" :tags ["functional" "object-oriented" "dsls" "concurrency features" "jvm"] }
|
||||
doc4 { :_id (ObjectId.) :title "Ruby" :tags ["dynamic" "object-oriented" "dsls" "jvm"] }
|
||||
doc5 { :_id (ObjectId.) :title "Groovy" :tags ["dynamic" "object-oriented" "dsls" "jvm"] }
|
||||
doc6 { :_id (ObjectId.) :title "OCaml" :tags ["functional" "static" "dsls"] }
|
||||
doc7 { :_id (ObjectId.) :title "Haskell" :tags ["functional" "static" "dsls" "concurrency features"] }
|
||||
- (mgcol/insert-batch coll [doc1 doc2 doc3 doc4 doc5 doc6 doc7])
|
||||
result1 (with-collection coll
|
||||
(find {})
|
||||
(paginate :page 1 :per-page 3)
|
||||
(sort { :title 1 })
|
||||
(read-preference (ReadPreference/primary))
|
||||
(options com.mongodb.Bytes/QUERYOPTION_NOTIMEOUT))
|
||||
result2 (with-collection coll
|
||||
(find {})
|
||||
(paginate :page 2 :per-page 3)
|
||||
(sort { :title 1 }))
|
||||
result3 (with-collection coll
|
||||
(find {})
|
||||
(paginate :page 3 :per-page 3)
|
||||
(sort { :title 1 }))
|
||||
result4 (with-collection coll
|
||||
(find {})
|
||||
(paginate :page 10 :per-page 3)
|
||||
(sort { :title 1 }))]
|
||||
(is (= [doc1 doc5 doc7] result1))
|
||||
(is (= [doc2 doc6 doc4] result2))
|
||||
(is (= [doc3] result3))
|
||||
(is (empty? result4))))
|
||||
|
||||
|
||||
(deftest combined-querying-dsl-example1
|
||||
(let [coll "docs"
|
||||
ma-doc { :_id (ObjectId.) :name "Massachusetts" :iso "MA" :population 6547629 :joined_in 1788 :capital "Boston" }
|
||||
de-doc { :_id (ObjectId.) :name "Delaware" :iso "DE" :population 897934 :joined_in 1787 :capital "Dover" }
|
||||
ny-doc { :_id (ObjectId.) :name "New York" :iso "NY" :population 19378102 :joined_in 1788 :capital "Albany" }
|
||||
ca-doc { :_id (ObjectId.) :name "California" :iso "CA" :population 37253956 :joined_in 1850 :capital "Sacramento" }
|
||||
tx-doc { :_id (ObjectId.) :name "Texas" :iso "TX" :population 25145561 :joined_in 1845 :capital "Austin" }
|
||||
top3 (partial-query (limit 3))
|
||||
by-population-desc (partial-query (sort { :population -1 }))
|
||||
_ (mgcol/insert-batch coll [ma-doc de-doc ny-doc ca-doc tx-doc])
|
||||
result (with-collection coll
|
||||
(find {})
|
||||
(merge top3)
|
||||
(merge by-population-desc))]
|
||||
(is (= result [ca-doc tx-doc ny-doc]))))
|
||||
|
||||
(deftest combined-querying-dsl-example2
|
||||
(let [coll "docs"
|
||||
ma-doc { :_id (ObjectId.) :name "Massachusetts" :iso "MA" :population 6547629 :joined_in 1788 :capital "Boston" }
|
||||
de-doc { :_id (ObjectId.) :name "Delaware" :iso "DE" :population 897934 :joined_in 1787 :capital "Dover" }
|
||||
ny-doc { :_id (ObjectId.) :name "New York" :iso "NY" :population 19378102 :joined_in 1788 :capital "Albany" }
|
||||
ca-doc { :_id (ObjectId.) :name "California" :iso "CA" :population 37253956 :joined_in 1850 :capital "Sacramento" }
|
||||
tx-doc { :_id (ObjectId.) :name "Texas" :iso "TX" :population 25145561 :joined_in 1845 :capital "Austin" }
|
||||
top3 (partial-query (limit 3))
|
||||
by-population-desc (partial-query (sort { :population -1 }))
|
||||
_ (mgcol/insert-batch coll [ma-doc de-doc ny-doc ca-doc tx-doc])
|
||||
result (with-collection coll
|
||||
(find {})
|
||||
(merge top3)
|
||||
(merge by-population-desc)
|
||||
(keywordize-fields false))]
|
||||
;; documents have fields as strings,
|
||||
;; not keywords
|
||||
(is (= (map #(% "name") result)
|
||||
(map #(% :name) [ca-doc tx-doc ny-doc])))))
|
||||

@@ -1,55 +1,57 @@
(set! *warn-on-reflection* true)

(ns monger.test.ragtime-test
(:require [monger.core :as mg]
[monger.collection :as mc]
monger.ragtime
[ragtime.protocols :refer :all]
[clojure.test :refer :all]))
[monger.test.helper :as helper]
monger.ragtime)
(:use clojure.test
[monger.test.fixtures :only [purge-migrations]]
ragtime.core))


(let [conn (mg/connect)
db (mg/get-db conn "monger-test")]
(defn purge-collections
[f]
(mc/remove db "meta.migrations")
(f)
(mc/remove db "meta.migrations"))
(helper/connect!)

(use-fixtures :each purge-collections)
(use-fixtures :each purge-migrations)

(when-not (get (System/getenv) "CI")
(deftest test-add-migration-id
(let [coll "meta.migrations"
key "1"]
(mc/remove db coll {})
(is (not (mc/any? db coll {:_id key})))
(is (not (some #{key} (applied-migration-ids db))))
(add-migration-id db key)

(when-not (get (System/getenv) "CI")
(deftest test-add-migration-id
(let [db (mg/get-db "monger-test")
coll "meta.migrations"
key "1"]
(mc/remove db coll {})
(is (not (mc/any? db coll {:_id key})))
(is (not (some #{key} (applied-migration-ids db))))
(add-migration-id db key)
(is (mc/any? db coll {:_id key}))
(is (some #{key} (applied-migration-ids db)))))


(deftest test-remove-migration-id
(let [db (mg/get-db "monger-test")
coll "meta.migrations"
key "1"]
(mc/remove db coll {})
(add-migration-id db key)
(is (mc/any? db coll {:_id key}))
(is (some #{key} (applied-migration-ids db)))
(remove-migration-id db key)
(is (not (some #{key} (applied-migration-ids db))))))


(deftest test-migrations-ordering
(let [db (mg/get-db "monger-test")
coll "meta.migrations"
all-keys [ "9" "4" "7" "1" "5" "3" "6" "2" "8"]]
(mc/remove db coll {})

(doseq [key all-keys]
(add-migration-id db key))

(doseq [key all-keys]
(is (mc/any? db coll {:_id key}))
(is (some #{key} (applied-migration-ids db)))))
(is (some #{key} (applied-migration-ids db))))


(deftest test-remove-migration-id
(let [coll "meta.migrations"
key "1"]
(mc/remove db coll {})
(add-migration-id db key)
(is (mc/any? db coll {:_id key}))
(is (some #{key} (applied-migration-ids db)))
(remove-migration-id db key)
(is (not (some #{key} (applied-migration-ids db))))))


(deftest test-migrations-ordering
(let [coll "meta.migrations"
all-keys [ "9" "4" "7" "1" "5" "3" "6" "2" "8"]]
(mc/remove db coll {})

(doseq [key all-keys]
(add-migration-id db key))

(doseq [key all-keys]
(is (mc/any? db coll {:_id key}))
(is (some #{key} (applied-migration-ids db))))

(testing "Applied migrations must come out in creation order"
(is (= all-keys (applied-migration-ids db))))))))
(testing "Applied migrations must come out in creation order"
(is (= all-keys (applied-migration-ids db)))))))

@@ -1,293 +1,276 @@
(set! *warn-on-reflection* true)

(ns monger.test.regular-finders-test
(:import [com.mongodb WriteResult WriteConcern DBCursor DBObject]
org.bson.types.ObjectId
java.util.Date)
(:require [monger.core :as mg]
[monger.collection :as mc]
[monger.util :as mu]
(:require [monger core util]
[monger.collection :as mgcol]
[monger.result :as mgres]
[monger.conversion :as mgcnv]
[clojure.test :refer :all]
[monger.operators :refer :all]
[monger.conversion :refer [to-db-object]]))
[monger.test.helper :as helper])
(:use clojure.test
monger.operators
monger.test.fixtures))

(let [conn (mg/connect)
db (mg/get-db conn "monger-test")]
(use-fixtures :each (fn [f]
(mc/remove db "people")
(mc/remove db "docs")
(mc/remove db "regular_finders_docs")
(mc/remove db "things")
(mc/remove db "libraries")
(f)
(mc/remove db "people")
(mc/remove db "docs")
(mc/remove db "regular_finders_docs")
(mc/remove db "things")
(mc/remove db "libraries")))
(helper/connect!)

;;
;; find-one
;;

(deftest find-one-full-document-when-collection-is-empty
(let [collection "regular_finders_docs"]
(is (nil? (mc/find-one db collection {})))))

(deftest find-one-full-document-as-map-when-collection-is-empty
(let [collection "regular_finders_docs"]
(mc/remove db collection)
(is (nil? (mc/find-one-as-map db collection {})))))
(use-fixtures :each purge-people purge-docs purge-things purge-libraries)


(deftest find-one-full-document-when-collection-has-matches
(let [collection "regular_finders_docs"
doc-id (mu/random-uuid)
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }
_ (mc/insert db collection doc)
found-one (mc/find-one db collection { :language "Clojure" })]
(is found-one)
(is (= (:_id doc) (mu/get-id found-one)))
(is (= (mgcnv/from-db-object found-one true) doc))
(is (= (mgcnv/to-db-object doc) found-one))))
;;
;; find-one
;;

(deftest find-one-full-document-as-map-when-collection-has-matches
(let [collection "regular_finders_docs"
doc-id (mu/random-uuid)
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
(mc/insert db collection doc)
(is (= doc (mc/find-one-as-map db collection { :language "Clojure" })))))
(deftest find-one-full-document-when-collection-is-empty
(let [collection "docs"]
(is (nil? (mgcol/find-one collection {})))))

(deftest find-one-full-document-as-map-when-collection-is-empty
(let [collection "docs"]
(is (nil? (mgcol/find-one-as-map collection {})))))


(deftest find-one-full-document-when-collection-has-matches
(let [collection "docs"
doc-id (monger.util/random-uuid)
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }
_ (mgcol/insert collection doc)
found-one (mgcol/find-one collection { :language "Clojure" })]
(is (= (:_id doc) (monger.util/get-id found-one)))
(is (= (mgcnv/from-db-object found-one true) doc))
(is (= (mgcnv/to-db-object doc) found-one))))

(deftest find-one-full-document-as-map-when-collection-has-matches
(let [collection "docs"
doc-id (monger.util/random-uuid)
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
(mgcol/insert collection doc)
(is (= doc (mgcol/find-one-as-map collection { :language "Clojure" })))))



(deftest find-one-partial-document-when-collection-has-matches
(let [collection "regular_finders_docs"
doc-id (mu/random-uuid)
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }
_ (mc/insert db collection doc)
loaded (mc/find-one db collection { :language "Clojure" } [:language])]
(is (nil? (.get ^DBObject loaded "data-store")))
(is (= doc-id (mu/get-id loaded)))
(is (= "Clojure" (.get ^DBObject loaded "language")))))
(deftest find-one-partial-document-when-collection-has-matches
(let [collection "docs"
doc-id (monger.util/random-uuid)
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }
_ (mgcol/insert collection doc)
loaded (mgcol/find-one collection { :language "Clojure" } [:language])]
(is (nil? (.get ^DBObject loaded "data-store")))
(is (= doc-id (monger.util/get-id loaded)))
(is (= "Clojure" (.get ^DBObject loaded "language")))))


(deftest find-one-partial-document-using-field-negation-when-collection-has-matches
(let [collection "regular_finders_docs"
doc-id (mu/random-uuid)
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }
_ (mc/insert db collection doc)
^DBObject loaded (mc/find-one db collection { :language "Clojure" } {:data-store 0 :_id 0})]
(is (nil? (.get loaded "data-store")))
(is (nil? (.get loaded "_id")))
(is (nil? (mu/get-id loaded)))
(is (= "Clojure" (.get loaded "language")))))
(deftest find-one-partial-document-using-field-negation-when-collection-has-matches
(let [collection "docs"
doc-id (monger.util/random-uuid)
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }
_ (mgcol/insert collection doc)
^DBObject loaded (mgcol/find-one collection { :language "Clojure" } {:data-store 0 :_id 0})]
(is (nil? (.get loaded "data-store")))
(is (nil? (.get loaded "_id")))
(is (nil? (monger.util/get-id loaded)))
(is (= "Clojure" (.get loaded "language")))))


(deftest find-one-partial-document-as-map-when-collection-has-matches
(let [collection "regular_finders_docs"
doc-id (mu/random-uuid)
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
(mc/insert db collection doc)
(is (= { :data-store "MongoDB", :_id doc-id }
(mc/find-one-as-map db collection { :language "Clojure" } [:data-store])))))
(deftest find-one-partial-document-as-map-when-collection-has-matches
(let [collection "docs"
doc-id (monger.util/random-uuid)
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
(mgcol/insert collection doc)
(is (= { :data-store "MongoDB", :_id doc-id } (mgcol/find-one-as-map collection { :language "Clojure" } [:data-store])))))


(deftest find-one-partial-document-as-map-when-collection-has-matches-with-keywordize
(let [collection "regular_finders_docs"
doc-id (mu/random-uuid)
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }
fields [:data-store]
_id (mc/insert db collection doc)
loaded (mc/find-one-as-map db collection { :language "Clojure" } fields true)
]
(is (= { :data-store "MongoDB", :_id doc-id } loaded ))))
(deftest find-one-partial-document-as-map-when-collection-has-matches-with-keywordize
(let [collection "docs"
doc-id (monger.util/random-uuid)
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }
fields [:data-store]
_id (mgcol/insert collection doc)
loaded (mgcol/find-one-as-map collection { :language "Clojure" } fields true)
]
(is (= { :data-store "MongoDB", :_id doc-id } loaded ))))


(deftest find-one-partial-document-as-map-when-collection-has-matches-with-keywordize-false
(let [collection "regular_finders_docs"
doc-id (mu/random-uuid)
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }
fields [:data-store]
_id (mc/insert db collection doc)
loaded (mc/find-one-as-map db collection { :language "Clojure" } fields false)]
(is (= { "_id" doc-id, "data-store" "MongoDB" } loaded ))))
(deftest find-one-partial-document-as-map-when-collection-has-matches-with-keywordize-false
(let [collection "docs"
doc-id (monger.util/random-uuid)
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }
fields [:data-store]
_id (mgcol/insert collection doc)
loaded (mgcol/find-one-as-map collection { :language "Clojure" } fields false)
]
(is (= { "_id" doc-id, "data-store" "MongoDB" } loaded ))))

;;
;; find-by-id
;;
;;
;; find-by-id
;;

(deftest find-full-document-by-string-id-when-that-document-does-not-exist
(let [collection "libraries"
doc-id (mu/random-uuid)]
(is (nil? (mc/find-by-id db collection doc-id)))))
(deftest find-full-document-by-string-id-when-that-document-does-not-exist
(let [collection "libraries"
doc-id (monger.util/random-uuid)]
(is (nil? (mgcol/find-by-id collection doc-id)))))

(deftest find-full-document-by-string-id-when-id-is-nil
(let [collection "libraries"
doc-id nil]
(is (thrown? IllegalArgumentException (mc/find-by-id db collection doc-id)))))
(deftest find-full-document-by-string-id-when-id-is-nil
(let [collection "libraries"
doc-id nil]
(is (thrown? IllegalArgumentException (mgcol/find-by-id collection doc-id)))))

(deftest find-full-document-by-object-id-when-that-document-does-not-exist
(let [collection "libraries"
doc-id (ObjectId.)]
(is (nil? (mc/find-by-id db collection doc-id)))))
(deftest find-full-document-by-object-id-when-that-document-does-not-exist
(let [collection "libraries"
doc-id (ObjectId.)]
(is (nil? (mgcol/find-by-id collection doc-id)))))

(deftest find-full-document-by-id-as-map-when-that-document-does-not-exist
(let [collection "libraries"
doc-id (mu/random-uuid)]
(is (nil? (mc/find-map-by-id db collection doc-id)))))
(deftest find-full-document-by-id-as-map-when-that-document-does-not-exist
(let [collection "libraries"
doc-id (monger.util/random-uuid)]
(is (nil? (mgcol/find-map-by-id collection doc-id)))))

(deftest find-full-document-by-id-as-map-when-id-is-nil
(let [collection "libraries"
doc-id nil]
(is (thrown? IllegalArgumentException
(mc/find-map-by-id db collection doc-id)))))
(deftest find-full-document-by-id-as-map-when-id-is-nil
(let [collection "libraries"
doc-id nil]
(is (thrown? IllegalArgumentException
(mgcol/find-map-by-id collection doc-id)))))


(deftest find-full-document-by-string-id-when-document-does-exist
(let [collection "libraries"
doc-id (mu/random-uuid)
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
(mc/insert db collection doc)
(is (= (to-db-object doc) (mc/find-by-id db collection doc-id)))))
(deftest find-full-document-by-string-id-when-document-does-exist
(let [collection "libraries"
doc-id (monger.util/random-uuid)
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
(mgcol/insert collection doc)
(is (= (doc (mgcol/find-by-id collection doc-id))))))

(deftest find-full-document-by-object-id-when-document-does-exist
(let [collection "libraries"
doc-id (ObjectId.)
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
(mc/insert db collection doc)
(is (= (to-db-object doc) (mc/find-by-id db collection doc-id)))))
(deftest find-full-document-by-object-id-when-document-does-exist
(let [collection "libraries"
doc-id (ObjectId.)
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
(mgcol/insert collection doc)
(is (= (doc (mgcol/find-by-id collection doc-id))))))

(deftest find-full-document-map-by-string-id-when-document-does-exist
(let [collection "libraries"
doc-id (mu/random-uuid)
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
(mc/insert db collection doc)
(is (= doc (mc/find-map-by-id db collection doc-id)))))
(deftest find-full-document-map-by-string-id-when-document-does-exist
(let [collection "libraries"
doc-id (monger.util/random-uuid)
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
(mgcol/insert collection doc)
(is (= (doc (mgcol/find-map-by-id collection doc-id))))))

(deftest find-full-document-map-by-object-id-when-document-does-exist
(let [collection "libraries"
doc-id (ObjectId.)
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
(mc/insert db collection doc)
(is (= doc (mc/find-map-by-id db collection doc-id)))))
(deftest find-full-document-map-by-object-id-when-document-does-exist
(let [collection "libraries"
doc-id (ObjectId.)
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
(mgcol/insert collection doc)
(is (= (doc (mgcol/find-map-by-id collection doc-id))))))

(deftest find-partial-document-by-id-when-document-does-exist
(let [collection "libraries"
doc-id (mu/random-uuid)
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
(mc/insert db collection doc)
(is (= (to-db-object { :_id doc-id :language "Clojure" })
(mc/find-by-id db collection doc-id [ :language ])))))
(deftest find-partial-document-by-id-when-document-does-exist
(let [collection "libraries"
doc-id (monger.util/random-uuid)
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
(mgcol/insert collection doc)
(is (= ({ :language "Clojure" } (mgcol/find-by-id collection doc-id [ :language ]))))))


(deftest find-partial-document-as-map-by-id-when-document-does-exist
(let [collection "libraries"
doc-id (mu/random-uuid)
fields [:data-store]
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }
_ (mc/insert db collection doc)
loaded (mc/find-map-by-id db collection doc-id [ :language ])]
(is (= { :language "Clojure", :_id doc-id } loaded ))))
(deftest find-partial-document-as-map-by-id-when-document-does-exist
(let [collection "libraries"
doc-id (monger.util/random-uuid)
fields [:data-store]
doc { :data-store "MongoDB", :language "Clojure", :_id doc-id }
_ (mgcol/insert collection doc)
loaded (mgcol/find-map-by-id collection doc-id [ :language ])]
(is (= { :language "Clojure", :_id doc-id } loaded ))
)
)


;;
;; find
;;
;;
;; find
;;

(deftest find-full-document-when-collection-is-empty
(let [collection "regular_finders_docs"
cursor (mc/find db collection)]
(is (empty? (iterator-seq cursor)))))
(deftest find-full-document-when-collection-is-empty
(let [collection "docs"
cursor (mgcol/find collection)]
(is (empty? (iterator-seq cursor)))))

(deftest find-document-seq-when-collection-is-empty
(let [collection "regular_finders_docs"]
(is (empty? (mc/find-seq db collection)))))
(deftest find-document-seq-when-collection-is-empty
(let [collection "docs"]
(is (empty? (mgcol/find-seq collection)))))

(deftest find-multiple-documents-when-collection-is-empty
(let [collection "libraries"]
(is (empty? (mc/find db collection { :language "Scala" })))))
(deftest find-multiple-documents-when-collection-is-empty
(let [collection "libraries"]
(is (empty? (mgcol/find collection { :language "Scala" })))))

(deftest find-multiple-maps-when-collection-is-empty
(let [collection "libraries"]
(is (empty? (mc/find-maps db collection { :language "Scala" })))))
(deftest find-multiple-maps-when-collection-is-empty
(let [collection "libraries"]
(is (empty? (mgcol/find-maps collection { :language "Scala" })))))

(deftest find-multiple-documents-by-regex
(let [collection "libraries"]
(mc/insert-batch db collection [{ :language "Clojure", :name "monger" }
{ :language "Java", :name "nhibernate" }
{ :language "JavaScript", :name "sprout-core" }])
(is (= 2 (monger.core/count (mc/find db collection { :language #"Java*" }))))))
(deftest find-multiple-documents-by-regex
(let [collection "libraries"]
(mgcol/insert-batch collection [{ :language "Clojure", :name "monger" }
{ :language "Java", :name "nhibernate" }
{ :language "JavaScript", :name "sprout-core" }])
(is (= 2 (monger.core/count (mgcol/find collection { :language #"Java*" }))))))

(deftest find-multiple-documents
(let [collection "libraries"]
(mc/insert-batch db collection [{ :language "Clojure", :name "monger" }
{ :language "Clojure", :name "langohr" }
{ :language "Clojure", :name "incanter" }
{ :language "Scala", :name "akka" }])
(is (= 1 (monger.core/count (mc/find db collection { :language "Scala" }))))
(is (= 3 (.count (mc/find db collection { :language "Clojure" }))))
(is (empty? (mc/find db collection { :language "Java" })))))
(deftest find-multiple-documents
(let [collection "libraries"]
(mgcol/insert-batch collection [{ :language "Clojure", :name "monger" }
{ :language "Clojure", :name "langohr" }
{ :language "Clojure", :name "incanter" }
{ :language "Scala", :name "akka" }])
(is (= 1 (monger.core/count (mgcol/find collection { :language "Scala" }))))
(is (= 3 (.count (mgcol/find collection { :language "Clojure" }))))
(is (empty? (mgcol/find collection { :language "Java" })))))


(deftest find-document-specify-fields
(let [collection "libraries"
_ (mc/insert db collection { :language "Clojure", :name "monger" })
result (mc/find db collection { :language "Clojure"} [:language])]
(is (= (set [:_id :language]) (-> (mgcnv/from-db-object (.next result) true) keys set)))))
(deftest find-document-specify-fields
(let [collection "libraries"
_ (mgcol/insert collection { :language "Clojure", :name "monger" })
result (mgcol/find collection { :language "Clojure"} [:language])]
(is (= (seq [:_id :language]) (keys (mgcnv/from-db-object (.next result) true))))))

(deftest find-and-iterate-over-multiple-documents-the-hard-way
(let [collection "libraries"]
(mc/insert-batch db collection [{ :language "Clojure", :name "monger" }
{ :language "Clojure", :name "langohr" }
{ :language "Clojure", :name "incanter" }
{ :language "Scala", :name "akka" }])
(doseq [doc (take 3 (map (fn [dbo]
(mgcnv/from-db-object dbo true))
(mc/find-seq db collection { :language "Clojure" })))]
(is (= "Clojure" (:language doc))))))
(deftest find-and-iterate-over-multiple-documents-the-hard-way
(let [collection "libraries"]
(mgcol/insert-batch collection [{ :language "Clojure", :name "monger" }
{ :language "Clojure", :name "langohr" }
{ :language "Clojure", :name "incanter" }
{ :language "Scala", :name "akka" }])
(doseq [doc (take 3 (map (fn [dbo]
(mgcnv/from-db-object dbo true))
(mgcol/find-seq collection { :language "Clojure" })))]
(is (= "Clojure" (:language doc))))))

(deftest find-and-iterate-over-multiple-documents
(let [collection "libraries"]
|
||||
(mc/insert-batch db collection [{ :language "Clojure", :name "monger" }
|
||||
{ :language "Clojure", :name "langohr" }
|
||||
{ :language "Clojure", :name "incanter" }
|
||||
{ :language "Scala", :name "akka" }])
|
||||
(doseq [doc (take 3 (mc/find-maps db collection { :language "Clojure" }))]
|
||||
(is (= "Clojure" (:language doc))))))
|
||||
(deftest find-and-iterate-over-multiple-documents
|
||||
(let [collection "libraries"]
|
||||
(mgcol/insert-batch collection [{ :language "Clojure", :name "monger" }
|
||||
{ :language "Clojure", :name "langohr" }
|
||||
{ :language "Clojure", :name "incanter" }
|
||||
{ :language "Scala", :name "akka" }])
|
||||
(doseq [doc (take 3 (mgcol/find-maps collection { :language "Clojure" }))]
|
||||
(is (= "Clojure" (:language doc))))))
|
||||
|
||||
|
||||
(deftest find-multiple-maps
|
||||
(let [collection "libraries"]
|
||||
(mc/insert-batch db collection [{ :language "Clojure", :name "monger" }
|
||||
{ :language "Clojure", :name "langohr" }
|
||||
{ :language "Clojure", :name "incanter" }
|
||||
{ :language "Scala", :name "akka" }])
|
||||
(is (= 1 (clojure.core/count (mc/find-maps db collection { :language "Scala" }))))
|
||||
(is (= 3 (.count (mc/find-maps db collection { :language "Clojure" }))))
|
||||
(is (empty? (mc/find-maps db collection { :language "Java" })))
|
||||
(is (empty? (mc/find-maps db collection { :language "Java" } [:language :name])))))
|
||||
(deftest find-multiple-maps
|
||||
(let [collection "libraries"]
|
||||
(mgcol/insert-batch collection [{ :language "Clojure", :name "monger" }
|
||||
{ :language "Clojure", :name "langohr" }
|
||||
{ :language "Clojure", :name "incanter" }
|
||||
{ :language "Scala", :name "akka" }])
|
||||
(is (= 1 (clojure.core/count (mgcol/find-maps collection { :language "Scala" }))))
|
||||
(is (= 3 (.count (mgcol/find-maps collection { :language "Clojure" }))))
|
||||
(is (empty? (mgcol/find-maps collection { :language "Java" })))
|
||||
(is (empty? (mgcol/find-maps monger.core/*mongodb-database* collection { :language "Java" } [:language :name])))))
|
||||
|
||||
|
||||
|
||||
(deftest find-multiple-partial-documents
|
||||
(let [collection "libraries"]
|
||||
(mc/insert-batch db collection [{ :language "Clojure", :name "monger" }
|
||||
{ :language "Clojure", :name "langohr" }
|
||||
{ :language "Clojure", :name "incanter" }
|
||||
{ :language "Scala", :name "akka" }])
|
||||
(let [scala-libs (mc/find db collection { :language "Scala" } [:name])
|
||||
clojure-libs (mc/find db collection { :language "Clojure"} [:language])]
|
||||
(is (= 1 (.count scala-libs)))
|
||||
(is (= 3 (.count clojure-libs)))
|
||||
(doseq [i clojure-libs]
|
||||
(let [doc (mgcnv/from-db-object i true)]
|
||||
(is (= (:language doc) "Clojure"))))
|
||||
(is (empty? (mc/find db collection { :language "Erlang" } [:name]))))))
|
||||
|
||||
(deftest find-maps-with-keywordize-false
|
||||
(let [collection "libraries"]
|
||||
(mc/insert-batch db collection [{ :language "Clojure", :name "monger" }
|
||||
{ :language "Clojure", :name "langohr" }])
|
||||
(let [results (mc/find-maps db collection {:name "langohr"} [] false)]
|
||||
(is (= 1 (.count results)))
|
||||
(is (= (get (first results) "language") "Clojure"))))))
|
||||
(deftest find-multiple-partial-documents
|
||||
(let [collection "libraries"]
|
||||
(mgcol/insert-batch collection [{ :language "Clojure", :name "monger" }
|
||||
{ :language "Clojure", :name "langohr" }
|
||||
{ :language "Clojure", :name "incanter" }
|
||||
{ :language "Scala", :name "akka" }])
|
||||
(let [scala-libs (mgcol/find collection { :language "Scala" } [:name])
|
||||
clojure-libs (mgcol/find collection { :language "Clojure"} [:language])]
|
||||
(is (= 1 (.count scala-libs)))
|
||||
(is (= 3 (.count clojure-libs)))
|
||||
(doseq [i clojure-libs]
|
||||
(let [doc (mgcnv/from-db-object i true)]
|
||||
(is (= (:language doc) "Clojure"))))
|
||||
(is (empty? (mgcol/find collection { :language "Erlang" } [:name]))))))
|
||||
|
|
|
|||
|
|
@@ -1,25 +1,54 @@
(ns monger.test.result-test
  (:import [com.mongodb BasicDBObject WriteResult WriteConcern] java.util.Date)
  (:require [monger.core :as mg]
            [monger.collection :as mc]
            [monger.result :as mgres]
            monger.util
            [clojure.test :refer :all]))
  (:require [monger core collection conversion]
            [monger.test.helper :as helper])
  (:use clojure.test))

(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")]
  (deftest test-updated-existing?-with-write-result
    (mc/remove db "libraries")
    (let [collection "libraries"
          doc-id (monger.util/random-uuid)
          date (Date.)
          doc { :created-at date :data-store "MongoDB" :language "Clojure" :_id doc-id }
          modified-doc { :created-at date :data-store "MongoDB" :language "Erlang" :_id doc-id }]
      (let [result (mc/update db collection { :language "Clojure" } doc {:upsert true})]
        (is (not (mgres/updated-existing? result)))
        (is (= 1 (mgres/affected-count result))))
      (is (mgres/updated-existing? (mc/update db collection { :language "Clojure" } doc {:upsert true})))
      (is (mgres/updated-existing? (mc/update db collection { :language "Clojure" } modified-doc {:multi false :upsert true})))
      (is (= 1 (mgres/affected-count (mc/remove db collection { :_id doc-id }))))
      (mc/remove db collection)
      (mg/disconnect conn))))
(helper/connect!)

;;
;; MongoCommandResult
;;

(deftest test-ok?
  (let [result-that-is-not-ok-1 (doto (BasicDBObject.) (.put "ok" 0))
        result-that-is-not-ok-2 (doto (BasicDBObject.) (.put "ok" "false"))
        result-that-is-ok-1 (doto (BasicDBObject.) (.put "ok" 1))
        result-that-is-ok-2 (doto (BasicDBObject.) (.put "ok" "true"))
        result-that-is-ok-3 (doto (BasicDBObject.) (.put "ok" 1.0))]
    (is (not (monger.result/ok? result-that-is-not-ok-1)))
    (is (not (monger.result/ok? result-that-is-not-ok-2)))
    (is (monger.result/ok? result-that-is-ok-1))
    (is (monger.result/ok? result-that-is-ok-2))
    (is (monger.result/ok? result-that-is-ok-3))))


(deftest test-has-error?
  (let [result-that-has-no-error1 (doto (BasicDBObject.) (.put "ok" 0))
        result-that-has-no-error2 (doto (BasicDBObject.) (.put "err" ""))
        result-that-has-error1 (doto (BasicDBObject.) (.put "err" (BasicDBObject.)))]
    (is (not (monger.result/has-error? result-that-has-no-error1)))
    (is (not (monger.result/has-error? result-that-has-no-error2)))
    (is (monger.result/has-error? result-that-has-error1))))


(deftest test-updated-existing?-with-db-object
  (let [input1 (doto (BasicDBObject.) (.put "updatedExisting" true))
        input2 (doto (BasicDBObject.) (.put "updatedExisting" false))
        input3 (BasicDBObject.)]
    (is (monger.result/updated-existing? input1))
    (is (not (monger.result/updated-existing? input2)))
    (is (not (monger.result/updated-existing? input3)))))

(deftest test-updated-existing?-with-write-result
  (monger.collection/remove "libraries")
  (let [collection "libraries"
        doc-id (monger.util/random-uuid)
        date (Date.)
        doc { :created-at date, :data-store "MongoDB", :language "Clojure", :_id doc-id }
        modified-doc { :created-at date, :data-store "MongoDB", :language "Erlang", :_id doc-id }]
    (is (not (monger.result/updated-existing? (monger.collection/update collection { :language "Clojure" } doc :upsert true))))
    (is (monger.result/updated-existing? (monger.collection/update collection { :language "Clojure" } doc :upsert true)))
    (monger.result/updated-existing? (monger.collection/update collection { :language "Clojure" } modified-doc :multi false :upsert true))
    (monger.collection/remove collection)))
@@ -1,50 +1,92 @@
(ns monger.test.ring.clojure-session-store-test
  (:require [monger.core :as mg]
  (:require [monger core util]
            [monger.collection :as mc]
            [clojure.test :refer :all]
            [ring.middleware.session.store :refer :all]
            [monger.ring.session-store :refer :all]))
            [monger.test.helper :as helper])
  (:use clojure.test
        ring.middleware.session.store
        monger.ring.session-store))


(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")]
  (defn purge-sessions
    [f]
    (mc/remove db "sessions")
    (f)
    (mc/remove db "sessions"))
(helper/connect!)

  (use-fixtures :each purge-sessions)
(defn purge-sessions
  [f]
  (mc/remove "web_sessions")
  (mc/remove "sessions")
  (f)
  (mc/remove "web_sessions")
  (mc/remove "sessions"))

(use-fixtures :each purge-sessions)


  (deftest test-reading-a-session-that-does-not-exist
    (let [store (session-store db "sessions")]
      (is (= {} (read-session store "a-missing-key-1228277")))))
(deftest test-reading-a-session-that-does-not-exist
  (let [store (session-store)]
    (is (= {} (read-session store "a-missing-key-1228277")))))


  (deftest test-reading-a-session-that-does-exist
    (let [store (session-store db "sessions")
          sk (write-session store nil {:library "Monger"})
          m (read-session store sk)]
      (is sk)
      (is (and (:_id m)))
      (is (= (dissoc m :_id)
             {:library "Monger"}))))
(deftest test-reading-a-session-that-does-not-exist-given-db
  (let [db (monger.core/get-db)
        store (session-store db "sessions")]
    (is (= {} (read-session store "a-missing-key-1228277")))))


  (deftest test-updating-a-session
    (let [store (session-store db "sessions")
          sk1 (write-session store nil {:library "Monger"})
          sk2 (write-session store sk1 {:library "Ring"})
          m (read-session store sk2)]
      (is (and sk1 sk2))
      (is (and (:_id m)))
      (is (= sk1 sk2))
      (is (= (dissoc m :_id)
             {:library "Ring"}))))
(deftest test-reading-a-session-that-does-exist
  (let [store (session-store)
        sk (write-session store nil {:library "Monger"})
        m (read-session store sk)]
    (is sk)
    (is (and (:_id m)))
    (is (= (dissoc m :_id)
           {:library "Monger"}))))

  (deftest test-deleting-a-session
    (let [store (session-store db "sessions")
          sk (write-session store nil {:library "Monger"})]
      (is (nil? (delete-session store sk)))
      (is (= {} (read-session store sk))))))

(deftest test-reading-a-session-that-does-exist-given-db
  (let [db (monger.core/get-db)
        store (session-store db "sessions")
        sk (write-session store nil {:library "Monger"})
        m (read-session store sk)]
    (is sk)
    (is (and (:_id m)))
    (is (= (dissoc m :_id)
           {:library "Monger"}))))


(deftest test-updating-a-session
  (let [store (session-store "sessions")
        sk1 (write-session store nil {:library "Monger"})
        sk2 (write-session store sk1 {:library "Ring"})
        m (read-session store sk2)]
    (is (and sk1 sk2))
    (is (and (:_id m)))
    (is (= sk1 sk2))
    (is (= (dissoc m :_id)
           {:library "Ring"}))))


(deftest test-updating-a-session-given-db
  (let [db (monger.core/get-db)
        store (session-store db "sessions")
        sk1 (write-session store nil {:library "Monger"})
        sk2 (write-session store sk1 {:library "Ring"})
        m (read-session store sk2)]
    (is (and sk1 sk2))
    (is (and (:_id m)))
    (is (= sk1 sk2))
    (is (= (dissoc m :_id)
           {:library "Ring"}))))


(deftest test-deleting-a-session
  (let [store (session-store "sessions")
        sk (write-session store nil {:library "Monger"})]
    (is (nil? (delete-session store sk)))
    (is (= {} (read-session store sk)))))


(deftest test-deleting-a-session
  (let [db (monger.core/get-db)
        store (session-store db "sessions")
        sk (write-session store nil {:library "Monger"})]
    (is (nil? (delete-session store sk)))
    (is (= {} (read-session store sk)))))
@@ -1,54 +1,100 @@
(ns monger.test.ring.session-store-test
  (:require [monger.core :as mg]
  (:require [monger core util]
            [monger.collection :as mc]
            [clojure.test :refer :all]
            [ring.middleware.session.store :refer :all]
            [monger.ring.session-store :refer :all]))
            [monger.test.helper :as helper])
  (:use clojure.test
        ring.middleware.session.store
        monger.ring.session-store))


(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")]
  (defn purge-sessions
    [f]
    (mc/remove db "sessions")
    (f)
    (mc/remove db "sessions"))
(helper/connect!)

  (use-fixtures :each purge-sessions)
(defn purge-sessions
  [f]
  (mc/remove "web_sessions")
  (mc/remove "sessions")
  (f)
  (mc/remove "web_sessions")
  (mc/remove "sessions"))

  (deftest test-reading-a-session-that-does-not-exist
    (let [store (monger-store db "sessions")]
      (is (= {} (read-session store "a-missing-key-1228277")))))
(use-fixtures :each purge-sessions)

  (deftest test-reading-a-session-that-does-exist
    (let [store (monger-store db "sessions")
          sk (write-session store nil {:library "Monger"})
          m (read-session store sk)]
      (is sk)
      (is (and (:_id m) (:date m)))
      (is (= (dissoc m :_id :date)
             {:library "Monger"}))))

  (deftest test-updating-a-session
    (let [store (monger-store db "sessions")
          sk1 (write-session store nil {:library "Monger"})
          sk2 (write-session store sk1 {:library "Ring"})
          m (read-session store sk2)]
      (is (and sk1 sk2))
      (is (and (:_id m) (:date m)))
      (is (= sk1 sk2))
      (is (= (dissoc m :_id :date)
             {:library "Ring"}))))
(deftest test-reading-a-session-that-does-not-exist
  (let [store (monger-store)]
    (is (= {} (read-session store "a-missing-key-1228277")))))

  (deftest test-deleting-a-session
    (let [store (monger-store db "sessions")
          sk (write-session store nil {:library "Monger"})]
      (is (nil? (delete-session store sk)))
      (is (= {} (read-session store sk)))))

  (deftest test-reader-extensions
    (let [d (java.util.Date.)
          oid (org.bson.types.ObjectId.)]
      (binding [*print-dup* true]
        (pr-str d)
        (pr-str oid)))))
(deftest test-reading-a-session-that-does-not-exist-given-db
  (let [db (monger.core/get-db)
        store (monger-store db "sessions")]
    (is (= {} (read-session store "a-missing-key-1228277")))))


(deftest test-reading-a-session-that-does-exist
  (let [store (monger-store)
        sk (write-session store nil {:library "Monger"})
        m (read-session store sk)]
    (is sk)
    (is (and (:_id m) (:date m)))
    (is (= (dissoc m :_id :date)
           {:library "Monger"}))))


(deftest test-reading-a-session-that-does-exist-given-db
  (let [db (monger.core/get-db)
        store (monger-store db "sessions")
        sk (write-session store nil {:library "Monger"})
        m (read-session store sk)]
    (is sk)
    (is (and (:_id m) (:date m)))
    (is (= (dissoc m :_id :date)
           {:library "Monger"}))))


(deftest test-updating-a-session
  (let [store (monger-store "sessions")
        sk1 (write-session store nil {:library "Monger"})
        sk2 (write-session store sk1 {:library "Ring"})
        m (read-session store sk2)]
    (is (and sk1 sk2))
    (is (and (:_id m) (:date m)))
    (is (= sk1 sk2))
    (is (= (dissoc m :_id :date)
           {:library "Ring"}))))


(deftest test-updating-a-session-given-db
  (let [db (monger.core/get-db)
        store (monger-store db "sessions")
        sk1 (write-session store nil {:library "Monger"})
        sk2 (write-session store sk1 {:library "Ring"})
        m (read-session store sk2)]
    (is (and sk1 sk2))
    (is (and (:_id m) (:date m)))
    (is (= sk1 sk2))
    (is (= (dissoc m :_id :date)
           {:library "Ring"}))))


(deftest test-deleting-a-session
  (let [store (monger-store "sessions")
        sk (write-session store nil {:library "Monger"})]
    (is (nil? (delete-session store sk)))
    (is (= {} (read-session store sk)))))


(deftest test-deleting-a-session-given-db
  (let [db (monger.core/get-db)
        store (monger-store db "sessions")
        sk (write-session store nil {:library "Monger"})]
    (is (nil? (delete-session store sk)))
    (is (= {} (read-session store sk)))))


(deftest test-reader-extensions
  (let [d (java.util.Date.)
        oid (org.bson.types.ObjectId.)]
    (binding [*print-dup* true]
      (pr-str d)
      (pr-str oid))))
@@ -1,40 +1,42 @@
(ns monger.test.stress-test
  (:require [monger.core :as mg]
            [monger.collection :as mc]
            [monger.conversion :refer [to-db-object from-db-object]]
            [clojure.test :refer :all])
  (:import [com.mongodb WriteConcern]
           java.util.Date))
  (:import [com.mongodb Mongo DB DBCollection WriteResult DBObject WriteConcern DBCursor]
           java.util.Date)
  (:require monger.core
            [monger.test.helper :as helper])
  (:use clojure.test))


(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")]
  (defn purge-collection
    [coll f]
    (mc/remove db coll)
    (f)
    (mc/remove db coll))
;;
;; Fixture functions
;;

  (defn purge-things-collection
    [f]
    (purge-collection "things" f))
(defn purge-collection
  [collection-name, f]
  (monger.collection/remove collection-name)
  (f)
  (monger.collection/remove collection-name))

  (use-fixtures :each purge-things-collection)
(defn purge-things-collection
  [f]
  (purge-collection "things" f))

  (deftest ^{:performance true} insert-large-batches-of-documents-without-object-ids
    (doseq [n [10 100 1000 10000 20000]]
      (let [collection "things"
            docs (map (fn [i]
                        (to-db-object { :title "Untitled" :created-at (Date.) :number i }))
                      (take n (iterate inc 1)))]
        (mc/remove db collection)
        (println "Inserting " n " documents...")
        (time (mc/insert-batch db collection docs))
        (is (= n (mc/count db collection))))))
(use-fixtures :each purge-things-collection)

  (deftest ^{:performance true} convert-large-number-of-dbojects-to-maps
    (doseq [n [10 100 1000 20000 40000]]
      (let [docs (map (fn [i]
                        (to-db-object {:title "Untitled" :created-at (Date.) :number i}))
                      (take n (iterate inc 1)))]
        (time (doall (map (fn [x] (from-db-object x true)) docs)))))))


;;
;; Tests
;;

(monger.core/set-default-write-concern! WriteConcern/NORMAL)

(deftest ^{:performance true} insert-large-batches-of-documents-without-object-ids
  (doseq [n [1000 10000 100000]]
    (let [collection "things"
          docs (map (fn [i]
                      (monger.conversion/to-db-object { :title "Untitled" :created-at (Date.) :number i }))
                    (take n (iterate inc 1)))]
      (monger.collection/remove collection)
      (println "Inserting " n " documents...")
      (time (monger.collection/insert-batch collection docs))
      (is (= n (monger.collection/count collection))))))
@ -1,169 +1,165 @@
|
|||
(set! *warn-on-reflection* true)
|
||||
|
||||
(ns monger.test.updating-test
|
||||
(:import [com.mongodb WriteResult WriteConcern DBObject]
|
||||
(:import [com.mongodb WriteResult WriteConcern DBCursor DBObject]
|
||||
org.bson.types.ObjectId
|
||||
java.util.Date)
|
||||
(:require [monger.core :as mg]
|
||||
(:require [monger core util]
|
||||
[monger.collection :as mc]
|
||||
[monger.util :as mu]
|
||||
[monger.result :as mr]
|
||||
[clojure.test :refer :all]
|
||||
[monger.operators :refer :all]
|
||||
[monger.conversion :refer [to-db-object]]))
|
||||
[monger.test.helper :as helper])
|
||||
(:use clojure.test
|
||||
monger.operators
|
||||
monger.test.fixtures
|
||||
[monger.conversion :only [to-db-object]]))
|
||||
|
||||
(let [conn (mg/connect)
|
||||
db (mg/get-db conn "monger-test")]
|
||||
(defn purge-collections
|
||||
[f]
|
||||
(mc/remove db "people")
|
||||
(mc/remove db "docs")
|
||||
(mc/remove db "things")
|
||||
(mc/remove db "libraries")
|
||||
(f)
|
||||
(mc/remove db "people")
|
||||
(mc/remove db "docs")
|
||||
(mc/remove db "things")
|
||||
(mc/remove db "libraries"))
|
||||
(helper/connect!)
|
||||
|
||||
(use-fixtures :each purge-collections)
|
||||
|
||||
(deftest ^{:updating true} update-document-by-id-without-upsert
|
||||
(let [collection "libraries"
|
||||
doc-id (mu/random-uuid)
|
||||
date (Date.)
|
||||
doc { :created-at date, :data-store "MongoDB", :language "Clojure", :_id doc-id }
|
||||
modified-doc { :created-at date, :data-store "MongoDB", :language "Erlang", :_id doc-id }]
|
||||
(mc/insert db collection doc)
|
||||
(is (= (to-db-object doc) (mc/find-by-id db collection doc-id)))
|
||||
(mc/update db collection { :_id doc-id } { $set { :language "Erlang" } })
|
||||
(is (= (to-db-object modified-doc) (mc/find-by-id db collection doc-id)))))
|
||||
|
||||
(deftest ^{:updating true} update-document-by-id-without-upsert-using-update-by-id
|
||||
(let [collection "libraries"
|
||||
doc-id (mu/random-uuid)
|
||||
date (Date.)
|
||||
doc { :created-at date, :data-store "MongoDB", :language "Clojure", :_id doc-id }
|
||||
modified-doc { :created-at date, :data-store "MongoDB", :language "Erlang", :_id doc-id }]
|
||||
(mc/insert db collection doc)
|
||||
(is (= (to-db-object doc) (mc/find-by-id db collection doc-id)))
|
||||
(mc/update-by-id db collection doc-id { $set { :language "Erlang" } })
|
||||
(is (= (to-db-object modified-doc) (mc/find-by-id db collection doc-id)))))
|
||||
|
||||
(deftest ^{:updating true} update-nested-document-fields-without-upsert-using-update-by-id
|
||||
(let [collection "libraries"
|
||||
doc-id (ObjectId.)
|
||||
date (Date.)
|
||||
doc { :created-at date :data-store "MongoDB" :language { :primary "Clojure" } :_id doc-id }
|
||||
modified-doc { :created-at date :data-store "MongoDB" :language { :primary "Erlang" } :_id doc-id }]
|
||||
(mc/insert db collection doc)
|
||||
(is (= (to-db-object doc) (mc/find-by-id db collection doc-id)))
|
||||
(mc/update-by-id db collection doc-id { $set { "language.primary" "Erlang" }})
|
||||
(is (= (to-db-object modified-doc) (mc/find-by-id db collection doc-id)))))
|
||||
(use-fixtures :each purge-people purge-docs purge-things purge-libraries)
|
||||
|
||||
|
||||
(deftest ^{:updating true} update-multiple-documents
|
||||
(let [collection "libraries"]
|
||||
(mc/insert-batch db collection [{ :language "Clojure", :name "monger" }
|
||||
{ :language "Clojure", :name "langohr" }
|
||||
{ :language "Clojure", :name "incanter" }
|
||||
{ :language "Scala", :name "akka" }])
|
||||
(is (= 3 (mc/count db collection { :language "Clojure" })))
|
||||
(is (= 1 (mc/count db collection { :language "Scala" })))
|
||||
(is (= 0 (mc/count db collection { :language "Python" })))
|
||||
(mc/update db collection { :language "Clojure" } { $set { :language "Python" } } {:multi true})
|
||||
(is (= 0 (mc/count db collection { :language "Clojure" })))
|
||||
(is (= 1 (mc/count db collection { :language "Scala" })))
|
||||
(is (= 3 (mc/count db collection { :language "Python" })))))
|
||||
;;
|
||||
;; update, save
|
||||
;;
|
||||
|
||||
(deftest ^{:updating true} update-document-by-id-without-upsert
|
||||
(let [collection "libraries"
|
||||
doc-id (monger.util/random-uuid)
|
||||
date (Date.)
|
||||
doc { :created-at date, :data-store "MongoDB", :language "Clojure", :_id doc-id }
|
||||
modified-doc { :created-at date, :data-store "MongoDB", :language "Erlang", :_id doc-id }]
|
||||
(mc/insert collection doc)
|
||||
(is (= (doc (mc/find-by-id collection doc-id))))
|
||||
(mc/update collection { :_id doc-id } { :language "Erlang" })
|
||||
(is (= (modified-doc (mc/find-by-id collection doc-id))))))
|
||||
|
||||
(deftest ^{:updating true} update-document-by-id-without-upsert-using-update-by-id
|
||||
(let [collection "libraries"
|
||||
doc-id (monger.util/random-uuid)
|
||||
date (Date.)
|
||||
doc { :created-at date, :data-store "MongoDB", :language "Clojure", :_id doc-id }
|
||||
modified-doc { :created-at date, :data-store "MongoDB", :language "Erlang", :_id doc-id }]
|
||||
(mc/insert collection doc)
|
||||
(is (= (doc (mc/find-by-id collection doc-id))))
|
||||
(mc/update-by-id collection doc-id { :language "Erlang" })
|
||||
(is (= (modified-doc (mc/find-by-id collection doc-id))))))
|
||||
|
||||
(deftest ^{:updating true} update-nested-document-fields-without-upsert-using-update-by-id
|
||||
(let [collection "libraries"
|
||||
doc-id (ObjectId.)
|
||||
date (Date.)
|
||||
doc { :created-at date :data-store "MongoDB" :language { :primary "Clojure" } :_id doc-id }
|
||||
modified-doc { :created-at date :data-store "MongoDB" :language { :primary "Erlang" } :_id doc-id }]
|
||||
(mc/insert collection doc)
|
||||
(is (= (doc (mc/find-by-id collection doc-id))))
|
||||
(mc/update-by-id collection doc-id { $set { "language.primary" "Erlang" }})
|
||||
(is (= (modified-doc (mc/find-by-id collection doc-id))))))
|
||||
|
||||
|
||||
(deftest ^{:updating true} save-a-new-document
|
||||
(let [collection "people"
|
||||
document {:name "Joe" :age 30}]
|
||||
(is (mc/save db "people" document))
|
||||
(is (= 1 (mc/count db collection)))))
|
||||
|
||||
(deftest ^{:updating true} save-and-return-a-new-document
|
||||
(let [collection "people"
|
||||
document {:name "Joe" :age 30}
|
||||
returned (mc/save-and-return db "people" document)]
|
||||
(is (:_id returned))
|
||||
(is (= document (dissoc returned :_id)))
|
||||
(is (= 1 (mc/count db collection)))))
|
||||
(deftest ^{:updating true} update-multiple-documents
|
||||
(let [collection "libraries"]
|
||||
(mc/insert collection { :language "Clojure", :name "monger" })
|
||||
(mc/insert collection { :language "Clojure", :name "langohr" })
|
||||
(mc/insert collection { :language "Clojure", :name "incanter" })
|
||||
(mc/insert collection { :language "Scala", :name "akka" })
|
||||
(is (= 3 (mc/count collection { :language "Clojure" })))
|
||||
(is (= 1 (mc/count collection { :language "Scala" })))
|
||||
(is (= 0 (mc/count collection { :language "Python" })))
|
||||
(mc/update collection { :language "Clojure" } { $set { :language "Python" } } :multi true)
|
||||
(is (= 0 (mc/count collection { :language "Clojure" })))
|
||||
(is (= 1 (mc/count collection { :language "Scala" })))
|
||||
(is (= 3 (mc/count collection { :language "Python" })))))
|
||||
|
||||
|
||||
(deftest ^{:updating true} save-a-new-basic-db-object
|
||||
(let [collection "people"
|
||||
doc (to-db-object {:name "Joe" :age 30})]
|
||||
(is (nil? (mu/get-id doc)))
|
||||
(mc/save db "people" doc WriteConcern/SAFE)
|
||||
(is (not (nil? (mu/get-id doc))))))
|
||||
(deftest ^{:updating true} save-a-new-document
|
||||
(let [collection "people"
|
||||
document {:name "Joe" :age 30}]
|
||||
(is (mr/ok? (mc/save "people" document)))
|
||||
(is (= 1 (mc/count collection)))))
|
||||
|
||||
(deftest ^{:updating true} save-and-return-a-new-document
|
||||
(let [collection "people"
|
||||
document {:name "Joe" :age 30}
|
||||
returned (mc/save-and-return "people" document)]
|
||||
(is (:_id returned))
|
||||
(is (= document (dissoc returned :_id)))
|
||||
(is (= 1 (mc/count collection)))))
|
||||
|
||||
|
||||
(deftest ^{:updating true} save-a-new-basic-db-object
|
||||
(let [collection "people"
|
||||
doc (to-db-object {:name "Joe" :age 30})]
|
||||
(is (nil? (monger.util/get-id doc)))
|
||||
(mc/save monger.core/*mongodb-database* "people" doc WriteConcern/SAFE)
|
||||
(is (not (nil? (monger.util/get-id doc))))))
|
||||
|
||||
|
||||
|
||||
(deftest ^{:updating true} update-an-existing-document-using-save
|
||||
(let [collection "people"
|
||||
doc-id "people-1"
|
||||
document { :_id doc-id, :name "Joe", :age 30 }]
|
||||
(is (mc/insert db collection document))
|
||||
(is (= 1 (mc/count db collection)))
|
||||
(mc/save db collection { :_id doc-id, :name "Alan", :age 40 })
|
||||
(is (= 1 (mc/count db collection { :name "Alan", :age 40 })))))
|
||||
(deftest ^{:updating true} update-an-existing-document-using-save
|
||||
(let [collection "people"
|
||||
doc-id "people-1"
|
||||
document { :_id doc-id, :name "Joe", :age 30 }]
|
||||
(is (mr/ok? (mc/insert "people" document)))
|
||||
(is (= 1 (mc/count collection)))
|
||||
(mc/save collection { :_id doc-id, :name "Alan", :age 40 })
|
||||
(is (= 1 (mc/count collection { :name "Alan", :age 40 })))))
|
||||
|
||||
(deftest ^{:updating true} update-an-existing-document-using-save-and-return
|
||||
(let [collection "people"
|
||||
document (mc/insert-and-return db collection {:name "Joe" :age 30})
|
||||
doc-id (:_id document)
|
||||
updated (mc/save-and-return db collection {:_id doc-id :name "Alan" :age 40})]
|
||||
(is (= {:_id doc-id :name "Alan" :age 40} updated))
|
||||
(is (= 1 (mc/count db collection)))
|
||||
(is (= 1 (mc/count db collection {:name "Alan" :age 40})))))
|
||||
(deftest ^{:updating true} update-an-existing-document-using-save-and-return
  (let [collection "people"
        document   (mc/insert-and-return "people" {:name "Joe" :age 30})
        doc-id     (:_id document)
        updated    (mc/save-and-return collection {:_id doc-id :name "Alan" :age 40})]
    (is (= {:_id doc-id :name "Alan" :age 40} updated))
    (is (= 1 (mc/count collection)))
    (is (= 1 (mc/count collection {:name "Alan" :age 40})))))

(deftest ^{:updating true} set-an-attribute-on-existing-document-using-update
  (let [collection "people"
        doc-id     (mu/object-id)
        document   { :_id doc-id, :name "Joe", :age 30 }]
    (is (mc/insert db collection document))
    (is (= 1 (mc/count db collection)))
    (is (= 0 (mc/count db collection { :has_kids true })))
    (mc/update db collection { :_id doc-id } { $set { :has_kids true } })
    (is (= 1 (mc/count db collection { :has_kids true })))))

(deftest ^{:updating true} set-an-attribute-on-existing-document-using-update
  (let [collection "people"
        doc-id     (monger.util/object-id)
        document   { :_id doc-id, :name "Joe", :age 30 }]
    (is (mr/ok? (mc/insert "people" document)))
    (is (= 1 (mc/count collection)))
    (is (= 0 (mc/count collection { :has_kids true })))
    (mc/update collection { :_id doc-id } { $set { :has_kids true } })
    (is (= 1 (mc/count collection { :has_kids true })))))

(deftest ^{:updating true} increment-multiple-fields-using-exists-operator-and-update
  (let [collection "matches"
        doc-id     (mu/object-id)
        document   { :_id doc-id :abc 0 :def 10 }]
    (mc/remove db collection)
    (is (mc/insert db collection document))
    (is (= 1 (mc/count db collection {:abc {$exists true} :def {$exists true}})))
    (mc/update db collection {:abc {$exists true} :def {$exists true}} {$inc {:abc 1 :def 0}})
    (is (= 1 (mc/count db collection { :abc 1 })))))

(deftest ^{:updating true} increment-multiple-fields-using-exists-operator-and-update
  (let [collection "matches"
        doc-id     (monger.util/object-id)
        document   { :_id doc-id :abc 0 :def 10 }]
    (mc/remove collection)
    (is (mr/ok? (mc/insert collection document)))
    (is (= 1 (mc/count collection {:abc {$exists true} :def {$exists true}})))
    (mc/update collection {:abc {$exists true} :def {$exists true}} {$inc {:abc 1 :def 0}})
    (is (= 1 (mc/count collection { :abc 1 })))))

(deftest ^{:updating true} upsert-a-document-using-update
  (let [collection   "libraries"
        doc-id       (mu/random-uuid)
        date         (Date.)
        doc          { :created-at date, :data-store "MongoDB", :language "Clojure", :_id doc-id }
        modified-doc { :created-at date, :data-store "MongoDB", :language "Erlang", :_id doc-id }]
    (is (not (mr/updated-existing? (mc/update db collection { :language "Clojure" } doc {:upsert true}))))
    (is (= 1 (mc/count db collection)))
    (is (mr/updated-existing? (mc/update db collection { :language "Clojure" } modified-doc {:multi false :upsert true})))
    (is (= 1 (mc/count db collection)))
    (is (= (to-db-object modified-doc) (mc/find-by-id db collection doc-id)))
    (mc/remove db collection)))

(deftest ^{:updating true} upsert-a-document-using-update
  (let [collection   "libraries"
        doc-id       (monger.util/random-uuid)
        date         (Date.)
        doc          { :created-at date, :data-store "MongoDB", :language "Clojure", :_id doc-id }
        modified-doc { :created-at date, :data-store "MongoDB", :language "Erlang", :_id doc-id }]
    (is (not (mr/updated-existing? (mc/update collection { :language "Clojure" } doc :upsert true))))
    (is (= 1 (mc/count collection)))
    (is (mr/updated-existing? (mc/update collection { :language "Clojure" } modified-doc :multi false :upsert true)))
    (is (= 1 (mc/count collection)))
    ;; compare the document itself, not a one-argument (= ...) that is always true
    (is (= modified-doc (mc/find-by-id collection doc-id)))
    (mc/remove collection)))

(deftest ^{:updating true} upsert-a-document-using-upsert
  (let [collection   "libraries"
        doc-id       (mu/random-uuid)
        date         (Date.)
        doc          {:created-at date :data-store "MongoDB" :language "Clojure" :_id doc-id}
        modified-doc {:created-at date :data-store "MongoDB" :language "Erlang" :_id doc-id}]
    (mc/remove db collection)
    (is (not (mr/updated-existing? (mc/upsert db collection {:language "Clojure"} doc))))
    (is (= 1 (mc/count db collection)))
    (is (mr/updated-existing? (mc/upsert db collection {:language "Clojure"} modified-doc {:multi false})))
    (is (= 1 (mc/count db collection)))
    (is (= (to-db-object modified-doc) (mc/find-by-id db collection doc-id)))
    (mc/remove db collection)))

(deftest ^{:updating true} upsert-a-document-using-upsert
  (let [collection   "libraries"
        doc-id       (monger.util/random-uuid)
        date         (Date.)
        doc          {:created-at date :data-store "MongoDB" :language "Clojure" :_id doc-id}
        modified-doc {:created-at date :data-store "MongoDB" :language "Erlang" :_id doc-id}]
    (mc/remove collection)
    (is (not (mr/updated-existing? (mc/upsert collection {:language "Clojure"} doc))))
    (is (= 1 (mc/count collection)))
    (is (mr/updated-existing? (mc/upsert collection {:language "Clojure"} modified-doc :multi false)))
    (is (= 1 (mc/count collection)))
    ;; compare the document itself, not a one-argument (= ...) that is always true
    (is (= modified-doc (mc/find-by-id collection doc-id)))
    (mc/remove collection)))

@@ -1,7 +1,7 @@
(ns monger.test.util-test
  (:import com.mongodb.DBObject)
  (:require [monger util conversion]
            [clojure.test :refer :all]))
  (:require [monger util conversion])
  (:use clojure.test))


(deftest get-object-id

@@ -1,11 +0,0 @@
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <logger name="org.mongodb" level="WARN"/>
  <root level="DEBUG">
    <appender-ref ref="STDOUT"/>
  </root>
</configuration>