Compare commits
No commits in common. "master" and "factories" have entirely different histories.
98 changed files with 4301 additions and 7311 deletions
13  .gitignore (vendored)
@@ -1,16 +1,9 @@
pom.xml*
pom.xml
*jar
/lib/
/classes/
.lein-*
.lein-failures
.lein-deps-sum
TAGS
checkouts/*
doc/*
deploy.docs.sh
target/*
todo.org
.nrepl-*
.idea/
*.iml
/.clj-kondo/.cache
/.lsp/.cache
22  .travis.yml
@@ -1,20 +1,4 @@
language: clojure
sudo: required
lein: lein
dist: xenial
before_script:
  # Give MongoDB server some time to boot
  - sleep 15
  - mongod --version
  - ./bin/ci/before_script.sh
script: lein do clean, javac, test
jdk:
  - openjdk10
  - oraclejdk11
  - openjdk12
services:
  - mongodb
branches:
  only:
    - master
    - 3.5.x-stable
before_install: lein plugin install lein-multi 1.1.0
before_script: ./bin/ci/before_script.sh
script: lein multi test
@@ -1,13 +0,0 @@
## Pre-requisites

The project uses [Leiningen 2](http://leiningen.org) and requires a recent MongoDB to be running
locally. Make sure you have those two installed, then run tests against all supported Clojure versions using

    ./bin/ci/before_script.sh
    lein all do clean, javac, test

## Pull Requests

Then create a branch and make your changes on it. Once you are done with your changes and all
tests pass, write a [good, detailed commit message](http://tbaggery.com/2008/04/19/a-note-about-git-commit-messages.html) and submit a pull request on GitHub.
1095  ChangeLog.md
File diff suppressed because it is too large
384  README.md
@@ -1,125 +1,366 @@
# Monger, a modern Clojure MongoDB Driver
[](https://travis-ci.org/xingzhefeng/monger)
Monger is an idiomatic [Clojure MongoDB driver](http://clojuremongodb.info) for a more civilized age.
# Monger

It has batteries included, offers a powerful, expressive query DSL,
strives to support modern MongoDB features, and has the "look and feel" and
flexibility of the MongoDB shell.
Monger is an idiomatic Clojure wrapper around the MongoDB Java driver. It offers a powerful, expressive query DSL, strives to support
every MongoDB 2.0+ feature, and is well maintained.

Monger is built for modern Clojure versions and sits on top of
the official MongoDB Java driver.
[](http://travis-ci.org/michaelklishin/monger)

## Project Goals

There is one MongoDB client for Clojure that has been around since 2009. So, why create another one? The Monger authors
wanted a client that would
wanted a client that will

* Support most modern MongoDB features, focusing on those that really matter.
* Be [well documented](http://clojuremongodb.info).
* Be [well tested](https://github.com/michaelklishin/monger/tree/master/test/monger/test).
* Target modern Clojure versions.
* Be as close to the Mongo shell query language as practical.
* Integrate with libraries like Joda Time, [Cheshire](https://github.com/dakrone/cheshire), clojure.data.json, and [Ragtime](https://github.com/weavejester/ragtime).
* Support URI connections to be friendly to Heroku and other PaaS providers.
* Not carry technical debt from 2009 forever.
* Support most MongoDB 2.0+ features, but only those that really matter. Grouping the way it is done today, for example, does not (it is easier to just use Map/Reduce directly).
* Be well documented.
* Be well tested.
* Be maintained, and not carry technical debt from 2009 forever.
* Integrate with libraries like clojure.data.json and Joda Time.
* Integrate usage of JavaScript files and ClojureScript (as soon as the compiler gets an artifact it is possible to depend on for easy embedding).
* Learn from other clients like the Java and Ruby ones.
* Target Clojure 1.3.0 and later from the ground up.
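The URI-connection goal above can be sketched as follows. This is a hedged example: the `monger.core/connect-via-uri` function and its `{:conn … :db …}` return shape are taken from later Monger releases and are an assumption here, and the code needs a running MongoDB to actually connect.

``` clojure
(ns my.service.uri-connect
  (:require [monger.core :as mg]))

;; Connect using a single connection string, e.g. the one a PaaS
;; provider such as Heroku exposes via an environment variable.
;; ASSUMPTION: connect-via-uri returning a {:conn :db} map is the
;; Monger 2.0+ API, not necessarily the version this diff targets.
(let [uri (or (System/getenv "MONGODB_URI")
              "mongodb://127.0.0.1/monger-test")
      {:keys [conn db]} (mg/connect-via-uri uri)]
  ;; use db with monger.collection functions, then disconnect
  (mg/disconnect conn))
```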

## Documentation & Examples

## Project Maturity
We are working on a documentation guides & examples site for the 1.0 release. In the meantime, please refer to the [test suite](https://github.com/michaelklishin/monger/tree/master/test/monger/test) for code examples.

Monger is not a young project: started in July 2011, it is over 7
years old and has seen active production use since week 1.
## Community

[Monger has a mailing list](https://groups.google.com/forum/#!forum/clojure-monger). Feel free to join it and ask any questions you may have.

To subscribe for announcements of releases, important changes and so on, please follow [@ClojureWerkz](https://twitter.com/#!/clojurewerkz) on Twitter.

## This is a Work In Progress

Core Monger APIs are stabilized but it is still a work in progress. Keep that in mind. 1.0 will be released in 2012
together with documentation guides and a dedicated website.

## Artifacts

Monger artifacts are [released to Clojars](https://clojars.org/com.novemberain/monger). If you are using
Maven, add the following repository definition to your `pom.xml`:

``` xml
<repository>
  <id>clojars.org</id>
  <url>http://clojars.org/repo</url>
</repository>
```

### The Most Recent Release

With Leiningen:

    [com.novemberain/monger "3.5.0"]
    [com.novemberain/monger "1.0.0-beta2"]

With Maven:

    <dependency>
      <groupId>com.novemberain</groupId>
      <artifactId>monger</artifactId>
      <version>3.5.0</version>
      <version>1.0.0-beta2</version>
    </dependency>

### Snapshots

## Getting Started
If you are comfortable with using snapshots, snapshot artifacts are [released to Clojars](https://clojars.org/com.novemberain/monger) every 24 hours.

Please refer to our [Getting Started guide](http://clojuremongodb.info/articles/getting_started.html). Don't
hesitate to join our [mailing list](https://groups.google.com/forum/#!forum/clojure-mongodb) and ask
questions, too!
With Leiningen:

    [com.novemberain/monger "1.0.0-SNAPSHOT"]

## Documentation & Examples
With Maven:

Please see our [documentation guides site](http://clojuremongodb.info/) and [API reference](http://reference.clojuremongodb.info).

Our [test suite](https://github.com/michaelklishin/monger/tree/master/test/monger/test)
also has many code examples.

## Community

[Monger has a mailing list](https://groups.google.com/forum/#!forum/clojure-mongodb). Feel
free to join it and ask any questions you may have.

To subscribe for announcements of releases, important changes and so
on, please follow [@ClojureWerkz](https://twitter.com/#!/clojurewerkz)
on Twitter.

    <dependency>
      <groupId>com.novemberain</groupId>
      <artifactId>monger</artifactId>
      <version>1.0.0-SNAPSHOT</version>
    </dependency>

## Supported Clojure versions

Monger requires Clojure 1.8+. The most recent
stable release is highly recommended.
Monger is built from the ground up for Clojure 1.3 and up.

## Continuous Integration Status
## Connecting to MongoDB

[](http://travis-ci.org/michaelklishin/monger)
Monger supports working with multiple connections and/or databases but is optimized for applications that only use one connection
and one database.

``` clojure
(ns my.service.server
  (:require [monger core util]))

;; localhost, default port
(monger.core/connect!)

;; given host, given port
(monger.core/connect! { :host "db.megacorp.internal" :port 7878 })
```

To set the default database Monger will use, use the `monger.core/get-db` and `monger.core/set-db!` functions in combination:

``` clojure
(ns my.service.server
  (:require [monger core]))

;; localhost, default port
(monger.core/connect!)
(monger.core/set-db! (monger.core/get-db "monger-test"))
```

To set the default write concern, use the `monger.core/set-default-write-concern!` function:

``` clojure
(monger.core/set-default-write-concern! WriteConcern/FSYNC_SAFE)
```

By default Monger uses `WriteConcern/SAFE` as the write concern. We believe that the MongoDB Java driver (as well as other
official drivers) uses very unsafe defaults: no exceptions are raised, even for network issues. That is not
a good default for most applications: many applications use MongoDB for its flexibility, not for extreme write throughput
requirements.

## Inserting Documents

To insert documents, use the `monger.collection/insert` and `monger.collection/insert-batch` functions.

``` clojure
(ns my.service.server
  (:use [monger.core :only [connect! connect set-db! get-db]]
        [monger.collection :only [insert insert-batch]])
  (:import [org.bson.types ObjectId]
           [com.mongodb DB WriteConcern]))

;; localhost, default port
(connect!)
(set-db! (monger.core/get-db "monger-test"))

;; without a document id
(insert "documents" { :first_name "John" :last_name "Lennon" })

;; multiple documents at once
(insert-batch "documents" [{ :first_name "John" :last_name "Lennon" }
                           { :first_name "Paul" :last_name "McCartney" }])

;; with an explicit document id
(insert "documents" { :_id (ObjectId.) :first_name "John" :last_name "Lennon" })

;; with a different write concern
(insert "documents" { :_id (ObjectId.) :first_name "John" :last_name "Lennon" } WriteConcern/JOURNAL_SAFE)

;; with a different database
(let [archive-db (get-db "monger-test.archive")]
  (insert archive-db "documents" { :first_name "John" :last_name "Lennon" } WriteConcern/NORMAL))
```

### Write Performance

Monger insert operations are efficient and have very little overhead compared to the underlying Java driver. Here
are some numbers from a fall 2010 MacBook Pro with a Core i7 and an Intel SSD drive:

```
Testing monger.test.stress
Inserting  1000 documents...
"Elapsed time: 38.317 msecs"
Inserting  10,000 documents...
"Elapsed time: 263.827 msecs"
Inserting  100,000 documents...
"Elapsed time: 1679.828 msecs"
```

With the `SAFE` write concern, it takes roughly 1.7 seconds to insert 100,000 documents.

## Regular Finders

The `monger.collection` namespace provides several finder functions that try to follow the MongoDB query language as closely as possible,
even when providing shortcuts for common cases.

``` clojure
(ns my.service.finders
  (:require [monger.collection :as mc])
  (:use [monger.operators]))

;; find one document by id, as a Clojure map
(mc/find-map-by-id "documents" (ObjectId. "4ec2d1a6b55634a935ea4ac8"))

;; find one document by id, as a `com.mongodb.DBObject` instance
(mc/find-by-id "documents" (ObjectId. "4ec2d1a6b55634a935ea4ac8"))

;; find one document as a Clojure map
(mc/find-one-as-map "documents" { :_id (ObjectId. "4ec2d1a6b55634a935ea4ac8") })

;; find one document by id, as a `com.mongodb.DBObject` instance
(mc/find-one "documents" { :_id (ObjectId. "4ec2d1a6b55634a935ea4ac8") })


;; all documents as Clojure maps
(mc/find-maps "documents")

;; all documents as `com.mongodb.DBObject` instances
(mc/find "documents")

;; with a query, as Clojure maps
(mc/find-maps "documents" { :year 1998 })

;; with a query, as `com.mongodb.DBObject` instances
(mc/find "documents" { :year 1998 })

;; with a query that uses operators
(mc/find "products" { :price_in_subunits { $gt 4000 $lte 12000 } })

;; with a query that uses operators as strings
(mc/find "products" { :price_in_subunits { "$gt" 4000 "$lte" 12000 } })
```

## Powerful Query DSL

Every application that works with data stores has to query them. As a consequence, having an expressive, powerful query DSL is a must
for client libraries like Monger.

Here is what the monger.query DSL feels like:

``` clojure
(with-collection "movies"
  (find { :year { $lt 2010 $gte 2000 }, :revenue { $gt 20000000 } })
  (fields [ :year :title :producer :cast :budget :revenue ])
  (sort-by { :revenue -1 })
  (skip 10)
  (limit 20)
  (hint "year-by-year-revenue-idx")
  (snapshot))
```

It is easy to add new DSL elements: adding pagination, for example, took less than 10 lines of Clojure code. Here is what
it looks like:

``` clojure
(with-collection coll
  (find {})
  (paginate :page 1 :per-page 3)
  (sort { :title 1 })
  (read-preference ReadPreference/PRIMARY))
```

The query DSL supports composition, too:

``` clojure
(let [top3               (partial-query (limit 3))
      by-population-desc (partial-query (sort { :population -1 }))
      result             (with-collection coll
                           (find {})
                           (merge top3)
                           (merge by-population-desc))]
  ;; ...
  )
```

More code examples can be found [in our test suite](https://github.com/michaelklishin/monger/tree/master/test/monger/test).

## Updating Documents

Use `monger.collection/update` and `monger.collection/save`.


## Removing Documents

Use `monger.collection/remove`.


## Counting Documents

Use `monger.collection/count`, `monger.collection/empty?` and `monger.collection/any?`.

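Taken together, the functions named in the three sections above can be sketched like this. This is a minimal sketch, not documented API usage: it assumes a running local MongoDB with the default connection already set up as in the earlier examples, and the exact argument shapes mirror how these functions are used elsewhere in this README.

``` clojure
(ns my.service.crud-sketch
  (:require [monger.collection :as mc]))

;; ASSUMPTION: monger.core/connect! and set-db! were already called.

;; update the documents matching a condition
(mc/update "people" { :first_name "Raul" } { "$set" { :first_name "Paul" } })

;; save acts like insert (no :_id) or update (with :_id)
(mc/save "people" { :first_name "Yoko" :last_name "Ono" })

;; remove documents matching a condition
(mc/remove "people" { :band "Deep Purple" })

;; count, empty? and any? report on collection contents
(mc/count "people")
(mc/count "people" { :band "The Beatles" })
(mc/empty? "people")
(mc/any? "people" { :first_name "Paul" })
```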

## Determining Whether an Operation Succeeded (or Failed)

To be documented.


## Validators with Validateur

Monger relies on [Validateur](http://github.com/michaelklishin/validateur) for data validation.

To be documented.

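Since the Validateur section above is not yet documented, here is a minimal sketch of what Validateur-based validation looks like. The `validation-set`, `presence-of`, and `valid?` names come from Validateur's own API and are an assumption here (this README does not document them), so treat this as illustrative rather than authoritative.

``` clojure
(ns my.service.validation
  ;; ASSUMPTION: these vars exist in the Validateur release in use
  (:require [validateur.validation :refer [validation-set presence-of valid?]]))

;; A validation set is a function from a map to a set of errors
(def person-validator
  (validation-set
    (presence-of :first_name)
    (presence-of :last_name)))

;; valid? returns true when the map passes every validation
(valid? person-validator { :first_name "John" :last_name "Lennon" })
(valid? person-validator { :first_name "John" })
```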
## Integration With Popular Libraries

Because Monger was built for Clojure 1.3 and later, it can take advantage of relatively new, powerful Clojure features such as protocols.


### Integration with clojure.data.json

Monger was created for AMQP and HTTP services that use JSON to serialize message payloads. When serializing documents to JSON, developers
usually want to represent `com.mongodb.ObjectId` instances as strings in the resulting JSON documents. Monger integrates with [clojure.data.json](http://github.com/clojure/data.json) to
make that effortless.

Just load the `monger.json` namespace and it will extend the `clojure.data.json/WriteJSON` protocol to support `com.mongodb.ObjectId` instances. Then
functions like `clojure.data.json/write-json` will serialize object ids as strings exactly the way you expect.

``` clojure
(ns my.service.handlers
  ;; Make clojure.data.json aware of ObjectId instances
  (:require [monger.json]))
```

### Integration with Joda Time

Monger provides the `monger.joda-time` namespace that extends its own Clojure-to-DBObject conversion protocols as well as the
[clojure.data.json](http://github.com/clojure/data.json) `WriteJSON` protocol to handle `org.joda.time.DateTime` instances. To use it, make sure that
you have Joda Time and clojure.data.json on your dependencies list, then load `monger.joda-time` like so:

``` clojure
(ns my.service.handlers
  ;; Make Monger conversion protocols and clojure.data.json aware of Joda Time's DateTime instances
  (:require [monger.joda-time]))
```

Now `clojure.data.json/write-json` and related functions will serialize Joda Time date time objects using the [ISO 8601 date time format](http://joda-time.sourceforge.net/apidocs/org/joda/time/format/ISODateTimeFormat.html). In addition, functions that convert MongoDB documents to
Clojure maps will instantiate Joda Time date time objects from the `java.util.Date` instances the MongoDB Java driver uses.

## Map/Reduce. Using JavaScript Resources.

To be documented.


## Operations On Indexes

To be documented.


## Database Commands

To be documented.


## GridFS Support

To be documented.


## Helper Functions

To be documented.

## Monger Is a ClojureWerkz Project

Monger is part of the [group of Clojure libraries known as ClojureWerkz](http://clojurewerkz.org), together with
[Cassaforte](http://clojurecassandra.info), [Langohr](http://clojurerabbitmq.info), [Elastisch](http://clojureelasticsearch.info), [Quartzite](http://clojurequartz.info) and several others.

Monger is part of the group of libraries known as ClojureWerkz, together with
[Neocons](https://github.com/michaelklishin/neocons), [Langohr](https://github.com/michaelklishin/langohr), [Elastisch](https://github.com/clojurewerkz/elastisch), [Quartzite](https://github.com/michaelklishin/quartzite) and several others.


## Development

Monger uses [Leiningen 2](https://github.com/technomancy/leiningen/blob/master/doc/TUTORIAL.md). Make sure you have it installed, then run tests against
supported Clojure versions using
Install [lein-multi](https://github.com/maravillas/lein-multi) with

    ./bin/ci/before_script.sh
    lein all do clean, javac, test

    lein plugin install lein-multi 1.1.0

Or, if you don't have MongoDB installed, you can use Docker:
then run tests against Clojure 1.3.0 and 1.4.0[-beta1] using

    docker-compose up
    ./bin/ci/before_script_docker.sh
    lein all do clean, javac, test

    lein multi test

Then create a branch and make your changes on it. Once you are done with your changes and all tests pass, submit a pull request
on GitHub.
@@ -128,7 +369,6 @@ on GitHub.

## License

Copyright (C) 2011-2018 [Michael S. Klishin](http://twitter.com/michaelklishin), Alex Petrov, and the ClojureWerkz team.
Copyright (C) 2011-2012 Michael S. Klishin

Double licensed under the [Eclipse Public License](http://www.eclipse.org/legal/epl-v10.html) (the same as Clojure) or
the [Apache Public License 2.0](http://www.apache.org/licenses/LICENSE-2.0.html).
Distributed under the [Eclipse Public License](http://www.eclipse.org/legal/epl-v10.html), the same as Clojure.

@@ -1,18 +1,8 @@
#!/bin/sh

# Check which MongoDB shell is available
if command -v mongosh >/dev/null 2>&1; then
    MONGO_SHELL="mongosh"
elif command -v mongo >/dev/null 2>&1; then
    MONGO_SHELL="mongo"
else
    echo "Error: Neither mongo nor mongosh shell found. Please install MongoDB shell."
    exit 1
fi

# MongoDB Java driver won't run authentication twice on the same DB instance,
# so we need to use multiple DBs.
$MONGO_SHELL --eval 'db.createUser({"user": "clojurewerkz/monger", "pwd": "monger", roles: ["dbAdmin"], mechanisms: ["SCRAM-SHA-1"], passwordDigestor: "client"})' monger-test
$MONGO_SHELL --eval 'db.createUser({"user": "clojurewerkz/monger", "pwd": "monger", roles: ["dbAdmin"], mechanisms: ["SCRAM-SHA-1"], passwordDigestor: "client"})' monger-test2
$MONGO_SHELL --eval 'db.createUser({"user": "clojurewerkz/monger", "pwd": "monger", roles: ["dbAdmin"], mechanisms: ["SCRAM-SHA-1"], passwordDigestor: "client"})' monger-test3
$MONGO_SHELL --eval 'db.createUser({"user": "clojurewerkz/monger", "pwd": "monger", roles: ["dbAdmin"], mechanisms: ["SCRAM-SHA-1"], passwordDigestor: "client"})' monger-test4
mongo --eval 'db.addUser("clojurewerkz/monger", "monger")' monger-test
mongo --eval 'db.addUser("clojurewerkz/monger", "monger")' monger-test2
mongo --eval 'db.addUser("clojurewerkz/monger", "monger")' monger-test3
mongo --eval 'db.addUser("clojurewerkz/monger", "monger")' monger-test4

@@ -1,18 +0,0 @@
#!/bin/sh

# Check which MongoDB shell is available in the container
if docker exec mongo_test which mongosh >/dev/null 2>&1; then
    MONGO_SHELL="mongosh"
elif docker exec mongo_test which mongo >/dev/null 2>&1; then
    MONGO_SHELL="mongo"
else
    echo "Error: Neither mongo nor mongosh shell found in the container."
    exit 1
fi

# MongoDB Java driver won't run authentication twice on the same DB instance,
# so we need to use multiple DBs.
docker exec mongo_test $MONGO_SHELL --eval 'db.createUser({"user": "clojurewerkz/monger", "pwd": "monger", roles: ["dbAdmin"], mechanisms: ["SCRAM-SHA-1"], passwordDigestor: "client"})' monger-test
docker exec mongo_test $MONGO_SHELL --eval 'db.createUser({"user": "clojurewerkz/monger", "pwd": "monger", roles: ["dbAdmin"], mechanisms: ["SCRAM-SHA-1"], passwordDigestor: "client"})' monger-test2
docker exec mongo_test $MONGO_SHELL --eval 'db.createUser({"user": "clojurewerkz/monger", "pwd": "monger", roles: ["dbAdmin"], mechanisms: ["SCRAM-SHA-1"], passwordDigestor: "client"})' monger-test3
docker exec mongo_test $MONGO_SHELL --eval 'db.createUser({"user": "clojurewerkz/monger", "pwd": "monger", roles: ["dbAdmin"], mechanisms: ["SCRAM-SHA-1"], passwordDigestor: "client"})' monger-test4
@@ -1,11 +0,0 @@
#!/bin/sh

# MongoDB seems to need some time to boot first. MK.
sleep 5

# MongoDB Java driver won't run authentication twice on the same DB instance,
# so we need to use multiple DBs.
mongo --eval 'db.createUser({"user": "clojurewerkz/monger", "pwd": "monger", roles: ["dbAdmin"], passwordDigestor: "client"})' monger-test
mongo --eval 'db.createUser({"user": "clojurewerkz/monger", "pwd": "monger", roles: ["dbAdmin"], passwordDigestor: "client"})' monger-test2
mongo --eval 'db.createUser({"user": "clojurewerkz/monger", "pwd": "monger", roles: ["dbAdmin"], passwordDigestor: "client"})' monger-test3
mongo --eval 'db.createUser({"user": "clojurewerkz/monger", "pwd": "monger", roles: ["dbAdmin"], passwordDigestor: "client"})' monger-test4
@@ -1,8 +0,0 @@
#!/bin/sh

sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 9DA31620334BD75D9DCB49F368818C72E52529D4

echo "deb [ arch=amd64,arm64 ] https://repo.mongodb.org/apt/ubuntu xenial/mongodb-org/4.0 multiverse" | sudo tee /etc/apt/sources.list.d/mongodb-org-4.0.list

sudo apt-get update
sudo apt-get install -y mongodb-org
@@ -1,11 +0,0 @@
# Use root/example as user/password credentials
version: '3.1'

services:

  mongo:
    image: mongo
    container_name: mongo_test
    restart: always
    ports:
      - "27017:27017"
82  examples/basic_operations.clj (new file)
@@ -0,0 +1,82 @@
(ns examples.basic-operations
  (:gen-class)
  (:require [monger core collection util conversion])
  (:import (com.mongodb Mongo DB DBObject))
  (:use [clojure.tools.cli]))

(defn fix-paul-mccartneys-name
  "Fix Paul McCartney's name"
  []
  (monger.collection/update "people" { :first_name "Raul" } { "$set" { :first_name "Paul" } }))

(do
  (let [args        *command-line-args*
        parsed-args (cli args
                         ["--port" "MongoDB port" :default 27017]
                         ["--host" "MongoDB host" :default "127.0.0.1"]
                         ["--db-name" :default "monger-example"])]

    (monger.core/connect! (first parsed-args))
    (monger.core/set-db! (monger.core/get-db "monger-example"))

    (println "Does the people collection exist: " (monger.collection/exists? "people"))

    ;; Insert a record into the people collection
    (monger.collection/insert "people" { :first_name "John" :last_name "Lennon" })

    ;; Count the records just inserted
    (println "People collection size is: " (monger.collection/count "people"))

    ;; Insert several records
    (monger.collection/insert-batch "people" [{ :first_name "Ringo" :last_name "Starr" }
                                              { :first_name "Raul" :last_name "McCartney" }
                                              { :first_name "George" :last_name "Harrison" }])

    (println "People collection size is: " (monger.collection/count "people"))

    ;; Fix a typo in the inserted record
    (monger.collection/update "people" { :first_name "Raul" } { "$set" { :first_name "Paul" } })

    (println (monger.collection/find-one-as-map "people" { :first_name "Paul" }))

    ;; Now, let's add a field to that record
    (monger.collection/update "people" { :first_name "Paul" } { "$set" { :years_on_stage 1 } })

    (println (monger.collection/find-one-as-map "people" { :first_name "Paul" }))

    ;; Increment the field 45 times
    (dotimes [n 45]
      (monger.collection/update "people" { :first_name "Paul" } { "$inc" { :years_on_stage 1 } })
      (println (monger.collection/find-one-as-map "people" { :first_name "Paul" })))

    ;; Remove the years_on_stage field
    (monger.collection/update "people" { :first_name "Paul" } { "$unset" { :years_on_stage 1 } })

    ;; Insert the record into the data set if it wasn't there yet
    (monger.collection/update "people" { :first_name "Yoko" } { :first_name "Yoko" :last_name "Ono" } :upsert true)

    ;; Update multiple records
    (monger.collection/update "people" { } { "$set" { :band "The Beatles" } } :multi true)

    ;; Save can act both like insert and update
    (def ian-gillan
      (monger.conversion/to-db-object
        { :first_name "Ian" :last_name "Gillan" }))

    ;; Performs an insert
    (monger.collection/save "people" ian-gillan)

    ;; Performs an update
    (monger.collection/save "people"
      { :_id (monger.util/get-id ian-gillan)
        :first_name "Ian"
        :last_name "Gillan" :band "Deep Purple" })

    ;; Drop the people collection
    (monger.collection/drop "people")))
83  project.clj
|
|
@ -1,58 +1,27 @@
|
|||
(defproject com.novemberain/monger "4.0.0-SNAPSHOT"
|
||||
:description "Monger is a Clojure MongoDB client for a more civilized age: friendly, flexible and with batteries included"
|
||||
:url "http://clojuremongodb.info"
|
||||
:min-lein-version "2.5.1"
|
||||
:license {:name "Eclipse Public License"
|
||||
:url "http://www.eclipse.org/legal/epl-v10.html"}
|
||||
:dependencies [[org.clojure/clojure "1.11.1"]
|
||||
[org.mongodb/mongodb-driver "3.12.11"]
|
||||
[clojurewerkz/support "1.5.0"]]
|
||||
:test-selectors {:default (fn [m]
|
||||
(and (not (:performance m))
|
||||
(not (:edge-features m))
|
||||
(not (:time-consuming m))))
|
||||
:focus :focus
|
||||
:authentication :authentication
|
||||
:updating :updating
|
||||
:indexing :indexing
|
||||
:external :external
|
||||
:cache :cache
|
||||
:gridfs :gridfs
|
||||
:command :command
|
||||
:integration :integration
|
||||
:performance :performance
|
||||
;; as in, edge mongodb server
|
||||
:edge-features :edge-features
|
||||
:time-consuming :time-consuming
|
||||
:all (constantly true)}
|
||||
:source-paths ["src/clojure"]
|
||||
:java-source-paths ["src/java"]
|
||||
:javac-options ["-target" "1.8" "-source" "1.8"]
|
||||
:mailing-list {:name "clojure-mongodb"
|
||||
:archive "https://groups.google.com/group/clojure-mongodb"
|
||||
:post "clojure-mongodb@googlegroups.com"}
|
||||
:profiles {:1.10 {:dependencies [[org.clojure/clojure "1.10.2"]]}
|
||||
:1.9 {:dependencies [[org.clojure/clojure "1.9.0"]]}
|
||||
:dev {:resource-paths ["test/resources"]
|
||||
:dependencies [[clj-time "0.15.1" :exclusions [org.clojure/clojure]]
|
||||
[cheshire "5.8.1" :exclusions [org.clojure/clojure]]
|
||||
[org.clojure/data.json "2.5.0" :exclusions [org.clojure/clojure]]
|
||||
[org.clojure/tools.cli "0.4.1" :exclusions [org.clojure/clojure]]
|
||||
[org.clojure/core.cache "0.7.1" :exclusions [org.clojure/clojure]]
|
||||
[ring/ring-core "1.7.1" :exclusions [org.clojure/clojure]]
|
||||
[com.novemberain/validateur "2.6.0" :exclusions [org.clojure/clojure]]
|
||||
[ch.qos.logback/logback-classic "1.2.3" :exclusions [org.slf4j/slf4j-api]]
|
||||
[ragtime/core "0.7.2" :exclusions [org.clojure/clojure]]]
|
||||
:plugins [[lein-codox "0.10.5"]]
|
||||
:codox {:source-paths ["src/clojure"]
|
||||
:namespaces [#"^monger\.(?!internal)"]}}
|
||||
;; only clj-time/JodaTime available, used to test monger.joda-time w/o clojure.data.json
|
||||
:dev2 {:resource-paths ["test/resources"]
|
||||
:dependencies [[clj-time "0.15.2" :exclusions [org.clojure/clojure]]]}}
|
||||
:aliases {"all" ["with-profile" "dev:dev,1.10:dev,1.9:dev"]}
|
||||
:repositories {"sonatype" {:url "https://oss.sonatype.org/content/repositories/releases"
|
||||
(defproject com.novemberain/monger "1.0.0-SNAPSHOT"
  :description "Monger is an experimental idiomatic Clojure wrapper around MongoDB Java driver"
  :license {:name "Eclipse Public License"}
  :mailing-list {:name "clojure-monger"
                 :archive "https://groups.google.com/group/clojure-monger"
                 :post "clojure-monger@googlegroups.com"}
  :repositories {"sonatype"
                 {:url "http://oss.sonatype.org/content/repositories/releases"
                  :snapshots false
                  :releases {:checksum :fail :update :always}}
                 "sonatype-snapshots" {:url "https://oss.sonatype.org/content/repositories/snapshots"
                                       :snapshots true
                                       :releases {:checksum :fail :update :always}}})
                  :releases {:checksum :fail :update :always}
                  }}
  :dependencies [[org.clojure/clojure "1.3.0"]
                 [org.mongodb/mongo-java-driver "2.7.3"]
                 [com.novemberain/validateur "1.0.0"]]
  :multi-deps {"1.4" [[org.clojure/clojure "1.4.0-beta1"]]
               :all [[org.mongodb/mongo-java-driver "2.7.3"]
                     [com.novemberain/validateur "1.0.0"]]}
  :dev-dependencies [[org.clojure/data.json "0.1.2" :exclusions [org.clojure/clojure]]
                     [clj-time "0.3.6" :exclusions [org.clojure/clojure]]
                     [codox "0.3.4" :exclusions [org.clojure/clojure]]
                     [org.clojure/tools.cli "0.2.1" :exclusions [org.clojure/clojure]]]
  :dev-resources-path "test/resources"
  :warn-on-reflection true
  :codox {:exclude [monger.internal.pagination]}
  :test-selectors {:focus (fn [v] (:focus v))})
@@ -1,85 +0,0 @@
;; This source code is dual-licensed under the Apache License, version
;; 2.0, and the Eclipse Public License, version 1.0.
;;
;; The APL v2.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team
;;
;; Licensed under the Apache License, Version 2.0 (the "License");
;; you may not use this file except in compliance with the License.
;; You may obtain a copy of the License at
;;
;; http://www.apache.org/licenses/LICENSE-2.0
;;
;; Unless required by applicable law or agreed to in writing, software
;; distributed under the License is distributed on an "AS IS" BASIS,
;; WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
;; See the License for the specific language governing permissions and
;; limitations under the License.
;; ----------------------------------------------------------------------------------
;;
;; The EPL v1.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team.
;; All rights reserved.
;;
;; This program and the accompanying materials are made available under the terms of
;; the Eclipse Public License Version 1.0,
;; which accompanies this distribution and is available at
;; http://www.eclipse.org/legal/epl-v10.html.
;; ----------------------------------------------------------------------------------

(ns monger.cache
  "clojure.core.cache implementation(s) on top of MongoDB.

   Related documentation guide: http://clojuremongodb.info/articles/integration.html"
  (:require [monger.collection :as mc :refer [find-one find-by-id find-map-by-id]]
            [clojure.core.cache :as cache]
            [monger.conversion :as cnv])
  (:import clojure.core.cache.CacheProtocol
           [com.mongodb DB DBObject WriteConcern]
           java.util.Map))

;;
;; Implementation
;;

(def ^{:const true}
  default-cache-collection "cache_entries")

;;
;; API
;;

(defrecord BasicMongerCache [db collection])

(extend-protocol cache/CacheProtocol
  BasicMongerCache
  (lookup [c k]
    (let [m (mc/find-map-by-id (:db c) (:collection c) k)]
      (:value m)))
  (has? [c k]
    (not (nil? (mc/find-by-id (:db c) (:collection c) k))))
  (hit [this k]
    this)
  (miss [c k v]
    (mc/insert (:db c) (:collection c) {:_id k :value v})
    c)
  (evict [c k]
    (mc/remove-by-id (:db c) (:collection c) k)
    c)
  (seed [c m]
    (mc/insert-batch (:db c) (:collection c) (map (fn [[k v]]
                                                    {:_id k :value v}) m))
    c))


(defn basic-monger-cache-factory
  ([^DB db]
   (BasicMongerCache. db default-cache-collection))
  ([^DB db collection]
   (BasicMongerCache. db collection))
  ([^DB db collection base]
   (cache/seed (BasicMongerCache. db collection) base)))
@@ -1,577 +0,0 @@
;; This source code is dual-licensed under the Apache License, version
;; 2.0, and the Eclipse Public License, version 1.0.
;;
;; The APL v2.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team
;; Copyright (c) 2012 Toby Hede
;; Copyright (c) 2012 Baishampayan Ghose
;;
;; Licensed under the Apache License, Version 2.0 (the "License");
;; you may not use this file except in compliance with the License.
;; You may obtain a copy of the License at
;;
;; http://www.apache.org/licenses/LICENSE-2.0
;;
;; Unless required by applicable law or agreed to in writing, software
;; distributed under the License is distributed on an "AS IS" BASIS,
;; WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
;; See the License for the specific language governing permissions and
;; limitations under the License.
;; ----------------------------------------------------------------------------------
;;
;; The EPL v1.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team.
;; Copyright (c) 2012 Toby Hede
;; Copyright (c) 2012 Baishampayan Ghose
;;
;; All rights reserved.
;;
;; This program and the accompanying materials are made available under the terms of
;; the Eclipse Public License Version 1.0,
;; which accompanies this distribution and is available at
;; http://www.eclipse.org/legal/epl-v10.html.
;; ----------------------------------------------------------------------------------

(ns monger.collection
  "Provides key functionality for interaction with MongoDB: inserting, querying, updating and deleting documents, performing Aggregation Framework
   queries, creating and dropping indexes, creating collections and more.

   For more advanced read queries, see monger.query.

   Related documentation guides:

   * http://clojuremongodb.info/articles/getting_started.html
   * http://clojuremongodb.info/articles/inserting.html
   * http://clojuremongodb.info/articles/querying.html
   * http://clojuremongodb.info/articles/updating.html
   * http://clojuremongodb.info/articles/deleting.html
   * http://clojuremongodb.info/articles/aggregation.html"
  (:refer-clojure :exclude [find remove count drop distinct empty? any? update])
  (:import [com.mongodb Mongo DB DBCollection WriteResult DBObject WriteConcern
            DBCursor MapReduceCommand MapReduceCommand$OutputType AggregationOutput
            AggregationOptions AggregationOptions$OutputMode]
           [java.util List Map]
           [java.util.concurrent TimeUnit]
           [clojure.lang IPersistentMap ISeq]
           org.bson.types.ObjectId)
  (:require [monger.core :as mc]
            [monger.result :as mres]
            [monger.conversion :refer :all]
            [monger.constraints :refer :all]
            [monger.util :refer [into-array-list]]))


;;
;; API
;;

;;
;; monger.collection/insert
;;

(defn ^WriteResult insert
  "Saves document to collection and returns a write result that monger.result/acknowledged?
   and related functions operate on. You can optionally specify a WriteConcern.

   In case you need the exact inserted document returned, with the :_id key generated,
   use monger.collection/insert-and-return instead."
  ([^DB db ^String coll document]
   (.insert (.getCollection db (name coll))
            (to-db-object document)
            ^WriteConcern mc/*mongodb-write-concern*))
  ([^DB db ^String coll document ^WriteConcern concern]
   (.insert (.getCollection db (name coll))
            (to-db-object document)
            concern)))


(defn ^clojure.lang.IPersistentMap insert-and-return
  "Like monger.collection/insert but returns the inserted document as a persistent Clojure map.

   If the :_id key wasn't set on the document, it will be generated and merged into the returned
   map."
  ([^DB db ^String coll document]
   (insert-and-return db coll document ^WriteConcern mc/*mongodb-write-concern*))
  ([^DB db ^String coll document ^WriteConcern concern]
   ;; MongoDB Java driver will generate the _id and set it but it
   ;; tries to mutate the inserted DBObject and it does not work
   ;; very well in our case, because that DBObject is short lived
   ;; and produced from the Clojure map we are passing in. Plus,
   ;; this approach is very awkward with immutable data structures
   ;; being the default. MK.
   (let [doc (merge {:_id (ObjectId.)} document)]
     (insert db coll doc concern)
     doc)))


(defn ^WriteResult insert-batch
  "Saves documents to collection. You can optionally specify WriteConcern as a third argument."
  ([^DB db ^String coll ^List documents]
   (.insert (.getCollection db (name coll))
            ^List (to-db-object documents)
            ^WriteConcern mc/*mongodb-write-concern*))
  ([^DB db ^String coll ^List documents ^WriteConcern concern]
   (.insert (.getCollection db (name coll))
            ^List (to-db-object documents)
            concern)))

;;
;; monger.collection/find
;;

(defn ^DBCursor find
  "Queries for objects in this collection.
   This function returns a DBCursor, which allows you to iterate over DBObjects.
   If you want to work with Clojure maps, use find-maps."
  ([^DB db ^String coll]
   (.find (.getCollection db (name coll))))
  ([^DB db ^String coll ^Map ref]
   (.find (.getCollection db (name coll))
          (to-db-object ref)))
  ([^DB db ^String coll ^Map ref fields]
   (.find (.getCollection db (name coll))
          (to-db-object ref)
          (as-field-selector fields))))

(defn find-maps
  "Queries for objects in this collection.
   This function returns a Clojure seq of maps.
   If you want to work directly with DBObject, use find."
  ([^DB db ^String coll]
   (with-open [dbc (find db coll)]
     (map (fn [x] (from-db-object x true)) dbc)))
  ([^DB db ^String coll ^Map ref]
   (with-open [dbc (find db coll ref)]
     (map (fn [x] (from-db-object x true)) dbc)))
  ([^DB db ^String coll ^Map ref fields]
   (find-maps db coll ref fields true))
  ([^DB db ^String coll ^Map ref fields keywordize]
   (with-open [dbc (find db coll ref fields)]
     (map (fn [x] (from-db-object x keywordize)) dbc))))

(defn find-seq
  "Queries for objects in this collection, returns ISeq of DBObjects."
  ([^DB db ^String coll]
   (with-open [dbc (find db coll)]
     (seq dbc)))
  ([^DB db ^String coll ^Map ref]
   (with-open [dbc (find db coll ref)]
     (seq dbc)))
  ([^DB db ^String coll ^Map ref fields]
   (with-open [dbc (find db coll ref fields)]
     (seq dbc))))

;;
;; monger.collection/find-one
;;

(defn ^DBObject find-one
  "Returns a single DBObject from this collection matching the query."
  ([^DB db ^String coll ^Map ref]
   (.findOne (.getCollection db (name coll))
             (to-db-object ref)))
  ([^DB db ^String coll ^Map ref fields]
   (.findOne (.getCollection db (name coll))
             (to-db-object ref)
             ^DBObject (as-field-selector fields))))

(defn ^IPersistentMap find-one-as-map
  "Returns a single object converted to Map from this collection matching the query."
  ([^DB db ^String coll ^Map ref]
   (from-db-object ^DBObject (find-one db coll ref) true))
  ([^DB db ^String coll ^Map ref fields]
   (from-db-object ^DBObject (find-one db coll ref fields) true))
  ([^DB db ^String coll ^Map ref fields keywordize]
   (from-db-object ^DBObject (find-one db coll ref fields) keywordize)))


;;
;; monger.collection/find-and-modify
;;

(defn ^IPersistentMap find-and-modify
  "Atomically modify a document (at most one) and return it."
  ([^DB db ^String coll ^Map conditions ^Map document {:keys [fields sort remove return-new upsert keywordize]
                                                       :or {fields nil
                                                            sort nil
                                                            remove false
                                                            return-new false
                                                            upsert false
                                                            keywordize true}}]
   (let [coll (.getCollection db (name coll))
         maybe-fields (when fields (as-field-selector fields))
         maybe-sort (when sort (to-db-object sort))]
     (from-db-object
      ^DBObject (.findAndModify ^DBCollection coll (to-db-object conditions) maybe-fields maybe-sort remove
                                (to-db-object document) return-new upsert) keywordize))))

;;
;; monger.collection/find-by-id
;;

(defn ^DBObject find-by-id
  "Returns a single object with matching _id field."
  ([^DB db ^String coll id]
   (check-not-nil! id "id must not be nil")
   (find-one db coll {:_id id}))
  ([^DB db ^String coll id fields]
   (check-not-nil! id "id must not be nil")
   (find-one db coll {:_id id} fields)))

(defn ^IPersistentMap find-map-by-id
  "Returns a single object, converted to map with matching _id field."
  ([^DB db ^String coll id]
   (check-not-nil! id "id must not be nil")
   (find-one-as-map db coll {:_id id}))
  ([^DB db ^String coll id fields]
   (check-not-nil! id "id must not be nil")
   (find-one-as-map db coll {:_id id} fields))
  ([^DB db ^String coll id fields keywordize]
   (check-not-nil! id "id must not be nil")
   (find-one-as-map db coll {:_id id} fields keywordize)))

;;
;; monger.collection/count
;;

(defn count
  "Returns the number of documents in this collection.

   Takes optional conditions as an argument."
  (^long [^DB db ^String coll]
   (.count (.getCollection db (name coll))))
  (^long [^DB db ^String coll ^Map conditions]
   (.count (.getCollection db (name coll)) (to-db-object conditions))))

(defn any?
  "Whether the collection has any items at all, or items matching query."
  ([^DB db ^String coll]
   (> (count db coll) 0))
  ([^DB db ^String coll ^Map conditions]
   (> (count db coll conditions) 0)))


(defn empty?
  "Whether the collection is empty."
  [^DB db ^String coll]
  (= (count db coll {}) 0))

;; monger.collection/update

(defn ^WriteResult update
  "Performs an update operation.

   Please note that update is a potentially destructive operation. It replaces the matched document
   with the given one, emptying the fields not mentioned in the new document. In order to only change
   certain fields, use \"$set\".

   You can use all the MongoDB modifier operations ($inc, $set, $unset, $push, $pushAll, $addToSet, $pop, $pull
   $pullAll, $rename, $bit) here as well.

   It also takes options, such as :upsert and :multi.
   By default :upsert and :multi are false."
  ([^DB db ^String coll ^Map conditions ^Map document]
   (update db coll conditions document {}))
  ([^DB db ^String coll ^Map conditions ^Map document {:keys [upsert multi write-concern]
                                                       :or {upsert false
                                                            multi false
                                                            write-concern mc/*mongodb-write-concern*}}]
   (.update (.getCollection db (name coll))
            (to-db-object conditions)
            (to-db-object document)
            upsert
            multi
            write-concern)))

(defn ^WriteResult upsert
  "Performs an upsert.

   This is a convenience function that delegates to monger.collection/update and
   sets :upsert to true.

   See monger.collection/update documentation."
  ([^DB db ^String coll ^Map conditions ^Map document]
   (upsert db coll conditions document {}))
  ([^DB db ^String coll ^Map conditions ^Map document {:keys [multi write-concern]
                                                       :or {multi false
                                                            write-concern mc/*mongodb-write-concern*}}]
   (update db coll conditions document {:multi multi :write-concern write-concern :upsert true})))

(defn ^WriteResult update-by-id
  "Updates a document with the given id."
  ([^DB db ^String coll id ^Map document]
   (update-by-id db coll id document {}))
  ([^DB db ^String coll id ^Map document {:keys [upsert write-concern]
                                          :or {upsert false
                                               write-concern mc/*mongodb-write-concern*}}]
   (check-not-nil! id "id must not be nil")
   (.update (.getCollection db (name coll))
            (to-db-object {:_id id})
            (to-db-object document)
            upsert
            false
            write-concern)))

(defn ^WriteResult update-by-ids
  "Updates documents with the given ids."
  ([^DB db ^String coll ids ^Map document]
   (update-by-ids db coll ids document {}))
  ([^DB db ^String coll ids ^Map document {:keys [upsert write-concern]
                                           :or {upsert false
                                                write-concern mc/*mongodb-write-concern*}}]
   (check-not-nil! (seq ids) "ids must not be nil or empty")
   (.update (.getCollection db (name coll))
            (to-db-object {:_id {"$in" ids}})
            (to-db-object document)
            upsert
            true
            write-concern)))


;; monger.collection/save

(defn ^WriteResult save
  "Saves an object to the given collection (does insert or update based on the object _id).

   If the object is not present in the database, insert operation will be performed.
   If the object is already in the database, it will be updated.

   This function returns a write result. If you want to get the exact persisted document back,
   use `save-and-return`."
  ([^DB db ^String coll ^Map document]
   (.save (.getCollection db (name coll))
          (to-db-object document)
          mc/*mongodb-write-concern*))
  ([^DB db ^String coll ^Map document ^WriteConcern write-concern]
   (.save (.getCollection db (name coll))
          (to-db-object document)
          write-concern)))

(defn ^clojure.lang.IPersistentMap save-and-return
  "Saves an object to the given collection (does insert or update based on the object _id).

   If the object is not present in the database, insert operation will be performed.
   If the object is already in the database, it will be updated.

   This function returns the exact persisted document back, including the `:_id` key in
   case of an insert.

   If you want to get a write result back, use `save`."
  ([^DB db ^String coll ^Map document]
   (save-and-return db coll document ^WriteConcern mc/*mongodb-write-concern*))
  ([^DB db ^String coll ^Map document ^WriteConcern write-concern]
   ;; see the comment in insert-and-return. Here we additionally need to make sure to not scrap the :_id key if
   ;; it is already present. MK.
   (let [doc (merge {:_id (ObjectId.)} document)]
     (save db coll doc write-concern)
     doc)))


;; monger.collection/remove

(defn ^WriteResult remove
  "Removes objects from the database."
  ([^DB db ^String coll]
   (.remove (.getCollection db (name coll)) (to-db-object {})))
  ([^DB db ^String coll ^Map conditions]
   (.remove (.getCollection db (name coll)) (to-db-object conditions))))


(defn ^WriteResult remove-by-id
  "Removes a single document with the given id."
  [^DB db ^String coll id]
  (check-not-nil! id "id must not be nil")
  (let [coll (.getCollection db (name coll))]
    (.remove coll (to-db-object {:_id id}))))

(defn purge-many
  "Purges (removes all documents from) multiple collections. Intended
   to be used in test environments."
  [^DB db xs]
  (doseq [coll xs]
    (remove db coll)))

;;
;; monger.collection/create-index
;;

(defn create-index
  "Forces creation of an index on a set of fields, if one does not already exist."
  ([^DB db ^String coll ^Map keys]
   (.createIndex (.getCollection db (name coll)) (as-field-selector keys)))
  ([^DB db ^String coll ^Map keys ^Map options]
   (.createIndex (.getCollection db (name coll))
                 (as-field-selector keys)
                 (to-db-object options))))


;;
;; monger.collection/ensure-index
;;

(defn ensure-index
  "Creates an index on a set of fields, if one does not already exist.
   This operation is inexpensive in the case when an index already exists.

   Options are:

   :unique (boolean) to create a unique index
   :name (string) to specify a custom index name and not rely on the generated one"
  ([^DB db ^String coll ^Map keys]
   (.createIndex (.getCollection db (name coll)) (as-field-selector keys)))
  ([^DB db ^String coll ^Map keys ^Map options]
   (.createIndex (.getCollection db (name coll))
                 (as-field-selector keys)
                 (to-db-object options)))
  ([^DB db ^String coll ^Map keys ^String index-name unique?]
   (.createIndex (.getCollection db (name coll))
                 (as-field-selector keys)
                 index-name
                 unique?)))


;;
;; monger.collection/indexes-on
;;

(defn indexes-on
  "Returns a list of the indexes for this collection."
  [^DB db ^String coll]
  (from-db-object (.getIndexInfo (.getCollection db (name coll))) true))


;;
;; monger.collection/drop-index
;;

(defn drop-index
  "Drops an index from this collection."
  [^DB db ^String coll idx]
  (if (string? idx)
    (.dropIndex (.getCollection db (name coll)) ^String idx)
    (.dropIndex (.getCollection db (name coll)) (to-db-object idx))))

(defn drop-indexes
  "Drops all indexes from this collection."
  [^DB db ^String coll]
  (.dropIndexes (.getCollection db (name coll))))


;;
;; monger.collection/exists?, /create, /drop, /rename
;;


(defn exists?
  "Checks whether a collection with the given name exists."
  ([^DB db ^String coll]
   (.collectionExists db coll)))

(defn create
  "Creates a collection with a given name and options.

   Options are:

   :capped (pass true to create a capped collection)
   :max (number of documents)
   :size (max allowed size of the collection, in bytes)"
  [^DB db ^String coll ^Map options]
  (.createCollection db coll (to-db-object options)))

(defn drop
  "Deletes collection from database."
  [^DB db ^String coll]
  (.drop (.getCollection db (name coll))))

(defn rename
  "Renames collection."
  ([^DB db ^String from ^String to]
   (.rename (.getCollection db (name from)) (name to)))
  ([^DB db ^String from ^String to drop-target?]
   (.rename (.getCollection db (name from)) (name to) drop-target?)))

;;
;; Map/Reduce
;;

(defn map-reduce
  "Performs a map reduce operation"
  ([^DB db ^String coll ^String js-mapper ^String js-reducer ^String output ^Map query]
   (let [coll (.getCollection db (name coll))]
     (.mapReduce coll js-mapper js-reducer output (to-db-object query))))
  ([^DB db ^String coll ^String js-mapper ^String js-reducer ^String output ^MapReduceCommand$OutputType output-type ^Map query]
   (let [coll (.getCollection db (name coll))]
     (.mapReduce coll js-mapper js-reducer output output-type (to-db-object query)))))


;;
;; monger.collection/distinct
;;

(defn distinct
  "Finds distinct values for a key"
  ([^DB db ^String coll ^String key]
   (.distinct (.getCollection db (name coll)) ^String (to-db-object key)))
  ([^DB db ^String coll ^String key ^Map query]
   (.distinct (.getCollection db (name coll)) ^String (to-db-object key) (to-db-object query))))


;;
;; Aggregation
;;

(defn- build-aggregation-options
  ^AggregationOptions
  [{:keys [^Boolean allow-disk-use cursor ^Long max-time]}]
  (cond-> (AggregationOptions/builder)
    allow-disk-use (.allowDiskUse allow-disk-use)
    cursor (.outputMode AggregationOptions$OutputMode/CURSOR)
    max-time (.maxTime max-time TimeUnit/MILLISECONDS)
    (:batch-size cursor) (.batchSize (int (:batch-size cursor)))
    true .build))

(defn aggregate
  "Executes an aggregation query. MongoDB 2.2+ only.
   Accepts the options :allow-disk-use and :cursor (a map with the :batch-size
   key), as described in the MongoDB manual. Additionally, the :max-time option
   is supported, for specifying a limit on the execution time of the query in
   milliseconds.

   The :keywordize option controls whether resulting map keys are turned into keywords; the default is true.

   See http://docs.mongodb.org/manual/applications/aggregation/ to learn more."
  [^DB db ^String coll stages & opts]
  (let [coll (.getCollection db (name coll))
        agg-opts (build-aggregation-options opts)
        pipe (into-array-list (to-db-object stages))
        res (.aggregate coll pipe agg-opts)
        {:keys [^Boolean keywordize]
         :or {keywordize true}} opts]
    (map #(from-db-object % keywordize) (iterator-seq res))))

(defn explain-aggregate
  "Returns the explain plan for an aggregation query. MongoDB 2.2+ only.

   See http://docs.mongodb.org/manual/applications/aggregation/ to learn more."
  [^DB db ^String coll stages & opts]
  (let [coll (.getCollection db (name coll))
        agg-opts (build-aggregation-options opts)
        pipe (into-array-list (to-db-object stages))
        res (.explainAggregate coll pipe agg-opts)]
    (from-db-object res true)))

;;
;; Misc
;;

(def ^{:const true}
  system-collection-pattern #"^(system|fs)")

(defn system-collection?
  "Evaluates to true if the given collection name refers to a system collection. System collections
   are prefixed with system. or fs. (default GridFS collection prefix)"
  [^String coll]
  (re-find system-collection-pattern coll))
@@ -1,108 +0,0 @@
;; This source code is dual-licensed under the Apache License, version
;; 2.0, and the Eclipse Public License, version 1.0.
;;
;; The APL v2.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team
;; Copyright (c) 2012 Toby Hede
;;
;; Licensed under the Apache License, Version 2.0 (the "License");
;; you may not use this file except in compliance with the License.
;; You may obtain a copy of the License at
;;
;; http://www.apache.org/licenses/LICENSE-2.0
;;
;; Unless required by applicable law or agreed to in writing, software
;; distributed under the License is distributed on an "AS IS" BASIS,
;; WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
;; See the License for the specific language governing permissions and
;; limitations under the License.
;; ----------------------------------------------------------------------------------
;;
;; The EPL v1.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team.
;; Copyright (c) 2012 Toby Hede
;; All rights reserved.
;;
;; This program and the accompanying materials are made available under the terms of
;; the Eclipse Public License Version 1.0,
;; which accompanies this distribution and is available at
;; http://www.eclipse.org/legal/epl-v10.html.
;; ----------------------------------------------------------------------------------

(ns monger.command
  "Provides convenience functions for performing most commonly used MongoDB commands.
   For a lower-level API that gives maximum flexibility, see `monger.core/command`. To use
   MongoDB 2.2 Aggregation Framework, see `monger.collection/aggregate`.

   Related documentation guides:

   * http://clojuremongodb.info/articles/commands.html
   * http://clojuremongodb.info/articles/aggregation.html
   * http://clojuremongodb.info/articles/mapreduce.html"
  (:require monger.core
            [monger.conversion :refer :all])
  (:import [com.mongodb MongoClient DB DBObject]))


;;
;; API
;;

(defn admin-command
  "Executes a command on the admin database"
  [^MongoClient conn m]
  (monger.core/command (monger.core/admin-db conn) m))

(defn raw-admin-command
  "Executes a command on the admin database"
  [^MongoClient conn ^DBObject cmd]
  (monger.core/raw-command (monger.core/admin-db conn) cmd))

(defn collection-stats
  [^DB database collection]
  (monger.core/command database {:collstats collection}))

(defn db-stats
  [^DB database]
  (monger.core/command database {:dbStats 1}))


(defn reindex-collection
  "Forces an existing collection to be reindexed using the reindexCollection command"
  [^DB database ^String collection]
  (monger.core/command database {:reIndex collection}))

(defn rename-collection
  "Changes the name of an existing collection using the renameCollection command"
  [^DB db ^String from ^String to]
  (monger.core/command db (sorted-map :renameCollection from :to to)))

(defn convert-to-capped
  "Converts an existing, non-capped collection to a capped collection using the convertToCapped command"
  [^DB db ^String collection ^long size]
  (monger.core/command db (sorted-map :convertToCapped collection :size size)))

(defn empty-capped
  "Removes all documents from a capped collection using the emptycapped command"
  [^DB db ^String collection]
  (monger.core/command db {:emptycapped collection}))


(defn compact
  "Rewrites and defragments a single collection using the compact command. This also forces all indexes on the collection to be rebuilt"
  [^DB db ^String collection]
  (monger.core/command db {:compact collection}))


(defn server-status
  [^DB db]
  (monger.core/command db {:serverStatus 1}))


(defn top
  [^MongoClient conn]
  (monger.core/command (monger.core/admin-db conn) {:top 1}))
@@ -1,44 +0,0 @@
;; This source code is dual-licensed under the Apache License, version
;; 2.0, and the Eclipse Public License, version 1.0.
;;
;; The APL v2.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team
;;
;; Licensed under the Apache License, Version 2.0 (the "License");
;; you may not use this file except in compliance with the License.
;; You may obtain a copy of the License at
;;
;; http://www.apache.org/licenses/LICENSE-2.0
;;
;; Unless required by applicable law or agreed to in writing, software
;; distributed under the License is distributed on an "AS IS" BASIS,
;; WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
;; See the License for the specific language governing permissions and
;; limitations under the License.
;; ----------------------------------------------------------------------------------
;;
;; The EPL v1.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team.
;; All rights reserved.
;;
;; This program and the accompanying materials are made available under the terms of
;; the Eclipse Public License Version 1.0,
;; which accompanies this distribution and is available at
;; http://www.eclipse.org/legal/epl-v10.html.
;; ----------------------------------------------------------------------------------

(ns monger.constraints)


;;
;; API
;;

(definline check-not-nil!
  [ref ^String message]
  `(when (nil? ~ref)
     (throw (IllegalArgumentException. ~message))))
|
|
@@ -1,180 +0,0 @@
;; This source code is dual-licensed under the Apache License, version
;; 2.0, and the Eclipse Public License, version 1.0.
;; Portions of the code are Copyright (c) 2009 Andrew Boekhoff
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team

(ns monger.conversion
  "Provides functions that convert between MongoDB Java driver classes (DBObject, DBList) and Clojure
   data structures (maps, collections). Most of the time, application developers won't need to use these
   functions directly because the Monger query DSL and many other functions convert documents to Clojure sequences and
   maps automatically. However, this namespace is part of the public API and is guaranteed to be stable between minor releases.

   Related documentation guides:

   * http://clojuremongodb.info/articles/inserting.html
   * http://clojuremongodb.info/articles/querying.html"
  (:import [com.mongodb DBObject BasicDBObject BasicDBList DBCursor]
           [clojure.lang IPersistentMap Named Keyword Ratio]
           [java.util List Map Date Set]
           [org.bson.types ObjectId Decimal128]))

(defprotocol ConvertToDBObject
  (^com.mongodb.DBObject to-db-object [input] "Converts the given piece of Clojure data to the BasicDBObject the MongoDB Java driver uses"))

(extend-protocol ConvertToDBObject
  nil
  (to-db-object [input]
    nil)

  String
  (to-db-object [^String input]
    input)

  Boolean
  (to-db-object [^Boolean input]
    input)

  java.util.Date
  (to-db-object [^java.util.Date input]
    input)

  Ratio
  (to-db-object [^Ratio input]
    (double input))

  Keyword
  (to-db-object [^Keyword input] (.getName input))

  Named
  (to-db-object [^Named input] (.getName input))

  IPersistentMap
  (to-db-object [^IPersistentMap input]
    (let [o (BasicDBObject.)]
      (doseq [[k v] input]
        (.put o (to-db-object k) (to-db-object v)))
      o))

  List
  (to-db-object [^List input] (map to-db-object input))

  Set
  (to-db-object [^Set input] (map to-db-object input))

  DBObject
  (to-db-object [^DBObject input] input)

  com.mongodb.DBRef
  (to-db-object [^com.mongodb.DBRef dbref]
    dbref)

  Object
  (to-db-object [input]
    input))



(defprotocol ConvertFromDBObject
  (from-db-object [input keywordize] "Converts the given DBObject instance to a piece of Clojure data"))

(extend-protocol ConvertFromDBObject
  nil
  (from-db-object [_ _] nil)

  Object
  (from-db-object [input _] input)

  Decimal128
  (from-db-object [^Decimal128 input _]
    (.bigDecimalValue input))

  List
  (from-db-object [^List input keywordize]
    (mapv #(from-db-object % keywordize) input))

  BasicDBList
  (from-db-object [^BasicDBList input keywordize]
    (mapv #(from-db-object % keywordize) input))

  com.mongodb.DBRef
  (from-db-object [^com.mongodb.DBRef input _]
    input)

  DBObject
  (from-db-object [^DBObject input keywordize]
    ;; DBObject provides .toMap, but the implementation in
    ;; subclass GridFSFile unhelpfully throws
    ;; UnsupportedOperationException.
    (persistent!
     (reduce (if keywordize
               (fn [m ^String k]
                 (assoc! m (keyword k) (from-db-object (.get input k) true)))
               (fn [m ^String k]
                 (assoc! m k (from-db-object (.get input k) false))))
             (transient {}) (.keySet input)))))


(defprotocol ConvertToObjectId
  (^org.bson.types.ObjectId to-object-id [input] "Instantiates an ObjectId from input unless the input itself is an ObjectId instance. In that case, returns the input as is."))

(extend-protocol ConvertToObjectId
  String
  (to-object-id [^String input]
    (ObjectId. input))

  Date
  (to-object-id [^Date input]
    (ObjectId. input))

  ObjectId
  (to-object-id [^ObjectId input]
    input))



(defprotocol FieldSelector
  (^com.mongodb.DBObject as-field-selector [input] "Converts values to a DBObject that can be used to specify a list of document fields (including negation support)"))

(extend-protocol FieldSelector
  DBObject
  (as-field-selector [^DBObject input]
    input)

  List
  (as-field-selector [^List input]
    (to-db-object (zipmap input (repeat 1))))

  Object
  (as-field-selector [input]
    (to-db-object input)))

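For reference, here is a hypothetical REPL sketch of the conversion functions removed above. It assumes the deleted `monger.conversion` namespace is still on the classpath; no MongoDB server is required for the round trip itself.

```clojure
;; Hypothetical sketch; assumes the monger.conversion namespace above is available.
(require '[monger.conversion :refer [to-db-object from-db-object]])

;; Clojure map -> BasicDBObject: keyword keys become strings, nested values recurse
(def dbo (to-db-object {:name "Joe" :scores [1 2 3]}))

;; DBObject -> Clojure map; the keywordize argument controls key conversion
(from-db-object dbo true)   ;; => {:name "Joe", :scores [1 2 3]}
(from-db-object dbo false)  ;; => {"name" "Joe", "scores" [1 2 3]}
```

Note how keywords survive a full round trip only when `keywordize` is true; with `false`, keys come back as the plain strings the driver stored.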
@@ -1,310 +0,0 @@
;; This source code is dual-licensed under the Apache License, version
;; 2.0, and the Eclipse Public License, version 1.0.
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team

(ns monger.core
  "Thin idiomatic wrapper around the MongoDB Java client. monger.core includes
   fundamental functions that connect to a database or replica set, set the default write concern
   and default database, run database commands and so on. Most of the functionality is in other
   monger.* namespaces, in particular monger.collection, monger.query and monger.gridfs.

   Related documentation guides:

   * http://clojuremongodb.info/articles/connecting.html
   * http://clojuremongodb.info/articles/commands.html
   * http://clojuremongodb.info/articles/gridfs.html"
  (:refer-clojure :exclude [count])
  (:require [monger.conversion :refer :all]
            [monger.util :refer [into-array-list]])
  (:import [com.mongodb MongoClient MongoClientURI MongoCredential DB WriteConcern DBObject DBCursor Bytes
            MongoClientOptions MongoClientOptions$Builder ServerAddress MapReduceOutput MongoException]
           [com.mongodb.gridfs GridFS]
           [java.util Map]))

;;
;; Defaults
;;

(def ^:dynamic ^String *mongodb-host* "127.0.0.1")
(def ^:dynamic ^long *mongodb-port* 27017)

(def ^:dynamic ^WriteConcern *mongodb-write-concern* WriteConcern/ACKNOWLEDGED)


;;
;; API
;;

(defn ^MongoClient connect
  "Connects to MongoDB. When used without arguments, connects to 127.0.0.1 on port 27017.

   Arguments:
     :host (\"127.0.0.1\" by default)
     :port (27017 by default)"
  {:arglists '([]
               [server-address options]
               [server-address options credentials]
               [[server-address & more] options]
               [{:keys [host port uri] :or {host *mongodb-host* port *mongodb-port*}}])}
  ([]
   (MongoClient.))
  ([server-address ^MongoClientOptions options]
   (if (coll? server-address)
     ;; connect to a replica set
     (let [server-list (into-array-list server-address)]
       (MongoClient. server-list options))
     ;; connect to a single instance
     (MongoClient. ^ServerAddress server-address options)))
  ([server-address ^MongoClientOptions options credentials]
   (let [creds (into-array-list (if (coll? credentials)
                                  credentials
                                  [credentials]))]
     (if (coll? server-address)
       (let [server-list (into-array-list server-address)]
         (MongoClient. server-list ^java.util.List creds options))
       (MongoClient. ^ServerAddress server-address ^java.util.List creds options))))
  ([{:keys [host port uri] :or {host *mongodb-host* port *mongodb-port*}}]
   (if uri
     (MongoClient. (MongoClientURI. uri))
     (MongoClient. ^String host ^Long port))))

(defn ^MongoClient connect-with-credentials
  "Connects with the provided credentials and default options"
  ([credentials]
   (connect-with-credentials *mongodb-host* *mongodb-port* credentials))
  ([^String hostname credentials]
   (connect-with-credentials hostname *mongodb-port* credentials))
  ([^String hostname ^long port credentials]
   (MongoClient. (into-array-list [(ServerAddress. hostname port)])
                 (into-array-list (if (coll? credentials)
                                    credentials
                                    [credentials])))))

(defn get-db-names
  "Gets a set of all database names present on the server"
  [^MongoClient conn]
  (set (.getDatabaseNames conn)))


(defn ^DB get-db
  "Gets a database reference by name."
  [^MongoClient conn ^String name]
  (.getDB conn name))

(defn drop-db
  "Drops a database"
  [^MongoClient conn ^String db]
  (.dropDatabase conn db))

(defn ^GridFS get-gridfs
  "Gets a GridFS instance for the given database."
  [^MongoClient conn ^String name]
  (GridFS. (.getDB conn name)))

(defn server-address
  ([^String hostname]
   (ServerAddress. hostname))
  ([^String hostname ^Long port]
   (ServerAddress. hostname port)))

(defn ^MongoClientOptions$Builder mongo-options-builder
  [{:keys [add-cluster-listener add-cluster-listeners add-command-listener add-command-listeners
           add-connection-pool-listener add-connection-pool-listeners add-server-listener add-server-listeners
           add-server-monitor-listener add-server-monitor-listeners always-use-mbeans application-name
           codec-registry compressor-list connect-timeout connections-per-host cursor-finalizer-enabled
           db-decoder-factory db-encoder-factory description heartbeat-connect-timeout heartbeat-frequency
           heartbeat-socket-timeout local-threshold max-connection-idle-time max-connection-life-time
           max-wait-time min-connections-per-host min-heartbeat-frequency read-concern read-preference
           required-replica-set-name retry-writes server-selection-timeout server-selector socket-keep-alive
           socket-factory socket-timeout ssl-context ssl-enabled ssl-invalid-host-name-allowed
           threads-allowed-to-block-for-connection-multiplier uuid-representation write-concern]}]
  (let [mob (MongoClientOptions$Builder.)]
    (when add-cluster-listener
      (.addClusterListener mob add-cluster-listener))
    (when add-cluster-listeners
      (doseq [cluster-listener add-cluster-listeners]
        (.addClusterListener mob cluster-listener)))
    (when add-command-listener
      (.addCommandListener mob add-command-listener))
    (when add-command-listeners
      (doseq [command-listener add-command-listeners]
        (.addCommandListener mob command-listener)))
    (when add-connection-pool-listener
      (.addConnectionPoolListener mob add-connection-pool-listener))
    (when add-connection-pool-listeners
      (doseq [connection-pool-listener add-connection-pool-listeners]
        (.addConnectionPoolListener mob connection-pool-listener)))
    (when add-server-listener
      (.addServerListener mob add-server-listener))
    (when add-server-listeners
      (doseq [server-listener add-server-listeners]
        (.addServerListener mob server-listener)))
    (when add-server-monitor-listener
      (.addServerMonitorListener mob add-server-monitor-listener))
    (when add-server-monitor-listeners
      (doseq [server-monitor-listener add-server-monitor-listeners]
        (.addServerMonitorListener mob server-monitor-listener)))
    (when always-use-mbeans
      (.alwaysUseMBeans mob always-use-mbeans))
    (when application-name
      (.applicationName mob application-name))
    (when codec-registry
      (.codecRegistry mob codec-registry))
    (when compressor-list
      (.compressorList mob compressor-list))
    (when connections-per-host
      (.connectionsPerHost mob connections-per-host))
    (when connect-timeout
      (.connectTimeout mob connect-timeout))
    (when cursor-finalizer-enabled
      (.cursorFinalizerEnabled mob cursor-finalizer-enabled))
    (when db-decoder-factory
      (.dbDecoderFactory mob db-decoder-factory))
    (when db-encoder-factory
      (.dbEncoderFactory mob db-encoder-factory))
    (when description
      (.description mob description))
    (when heartbeat-connect-timeout
      (.heartbeatConnectTimeout mob heartbeat-connect-timeout))
    (when heartbeat-frequency
      (.heartbeatFrequency mob heartbeat-frequency))
    (when heartbeat-socket-timeout
      (.heartbeatSocketTimeout mob heartbeat-socket-timeout))
    (when ssl-context
      (.sslContext mob ssl-context))
    (when local-threshold
      (.localThreshold mob local-threshold))
    (when max-connection-idle-time
      (.maxConnectionIdleTime mob max-connection-idle-time))
    (when max-wait-time
      (.maxWaitTime mob max-wait-time))
    (when max-connection-life-time
      (.maxConnectionLifeTime mob max-connection-life-time))
    (when min-connections-per-host
      (.minConnectionsPerHost mob min-connections-per-host))
    (when min-heartbeat-frequency
      (.minHeartbeatFrequency mob min-heartbeat-frequency))
    (when read-concern
      (.readConcern mob read-concern))
    (when read-preference
      (.readPreference mob read-preference))
    (when required-replica-set-name
      (.requiredReplicaSetName mob required-replica-set-name))
    (when retry-writes
      (.retryWrites mob retry-writes))
    (when server-selection-timeout
      (.serverSelectionTimeout mob server-selection-timeout))
    (when server-selector
      (.serverSelector mob server-selector))
    (when socket-keep-alive
      (.socketKeepAlive mob socket-keep-alive))
    (when socket-factory
      (.socketFactory mob socket-factory))
    (when socket-timeout
      (.socketTimeout mob socket-timeout))
    (when ssl-enabled
      (.sslEnabled mob ssl-enabled))
    (when ssl-invalid-host-name-allowed
      (.sslInvalidHostNameAllowed mob ssl-invalid-host-name-allowed))
    (when threads-allowed-to-block-for-connection-multiplier
      (.threadsAllowedToBlockForConnectionMultiplier mob threads-allowed-to-block-for-connection-multiplier))
    (when uuid-representation
      (.uuidRepresentation mob uuid-representation))
    (when write-concern
      (.writeConcern mob write-concern))
    mob))

(defn ^MongoClientOptions mongo-options
  [opts]
  (let [mob (mongo-options-builder opts)]
    (.build mob)))

(defn disconnect
  "Closes the given connection to MongoDB"
  [^MongoClient conn]
  (.close conn))

(def ^:const admin-db-name "admin")

(defn ^DB admin-db
  "Returns the admin database"
  [^MongoClient conn]
  (get-db conn admin-db-name))


(defn set-default-write-concern!
  "Sets *mongodb-write-concern*"
  [wc]
  (alter-var-root #'*mongodb-write-concern* (constantly wc)))


(defn connect-via-uri
  "Connects to MongoDB using a connection URI and returns the connection and database as a map with keys :conn and :db.
   Commonly used for PaaS-based applications, for example, those running on Heroku.
   If a username and password are provided in the URI, performs authentication."
  [^String uri-string]
  (let [uri  (MongoClientURI. uri-string)
        conn (MongoClient. uri)]
    (if-let [db-name (.getDatabase uri)]
      {:conn conn :db (.getDB conn db-name)}
      (throw (IllegalArgumentException. "No database name specified in URI. Monger requires a database to be explicitly configured.")))))

(defn ^com.mongodb.CommandResult command
  "Runs a database command (please check the MongoDB documentation for the complete list of commands).

   Ordering of keys in the command document may matter. Please use sorted or array maps instead of map literals, for example:
     (array-map :near 50 :test 430 :num 10)

   For commonly used commands (distinct, count, map/reduce etc.), use the monger.command and monger.collection
   functions such as distinct, count, drop, drop-indexes and map-reduce, respectively."
  [^DB database ^Map cmd]
  (.command ^DB database ^DBObject (to-db-object cmd)))

(defn ^com.mongodb.CommandResult raw-command
  "Like monger.core/command but accepts DBObjects"
  [^DB database ^DBObject cmd]
  (.command database cmd))

(defprotocol Countable
  (count [this] "Returns the size of the object"))

(extend-protocol Countable
  DBCursor
  (count [^DBCursor this]
    (.count this))

  MapReduceOutput
  (count [^MapReduceOutput this]
    ;; The MongoDB Java driver could use a more specific type than Iterable but
    ;; it always uses DBCollection#find to populate the result set. MK.
    (.count ^DBCursor (.results this))))

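For context, a hypothetical usage sketch of the core connection API removed above. It assumes a MongoDB server listening on 127.0.0.1:27017; the database name `monger-test` is made up for illustration.

```clojure
;; Hypothetical sketch; assumes a local MongoDB server and the monger.core namespace above.
(require '[monger.core :as mg])

(let [conn (mg/connect)                    ;; no-args arity: 127.0.0.1:27017
      db   (mg/get-db conn "monger-test")] ;; "monger-test" is a made-up database name
  ;; run a raw command; use sorted/array maps when key order matters
  (println (mg/command db (array-map :ping 1)))
  (mg/disconnect conn))
```

The same connection can also be obtained from a URI via `connect-via-uri`, which returns `{:conn ... :db ...}` and throws if the URI omits a database name.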
@@ -1,56 +0,0 @@
;; This source code is dual-licensed under the Apache License, version
;; 2.0, and the Eclipse Public License, version 1.0.
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team

(ns monger.credentials
  "Helper functions for instantiating various types
   of credentials."
  (:require [clojurewerkz.support.chars :refer :all])
  (:import [com.mongodb MongoCredential]))

;;
;; API
;;

(defn ^MongoCredential create
  "Creates a MongoCredential instance with an unspecified mechanism.
   The client will negotiate the best mechanism based on the
   version of the server that the client is authenticating to."
  [^String username ^String database pwd]
  (MongoCredential/createCredential username database (to-char-array pwd)))

(defn ^MongoCredential x509
  "Creates a MongoCredential instance for the X.509-based authentication
   protocol."
  [^String username]
  (MongoCredential/createMongoX509Credential username))

@@ -1,143 +0,0 @@
;; This source code is dual-licensed under the Apache License, version
;; 2.0, and the Eclipse Public License, version 1.0.
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team

(ns monger.cursor
  "Helper functions for the DBCursor object:

   * initializing a new cursor
   * reading and updating cursor options"
  (:import [com.mongodb DB DBCursor Bytes]
           [java.util List Map]
           [java.lang Integer]
           [clojure.lang Keyword])
  (:require [monger.core]
            [monger.conversion :refer [to-db-object from-db-object as-field-selector]]))

(defn ^DBCursor make-db-cursor
  "Initializes a new DB cursor."
  ([^DB db ^String coll]
   (make-db-cursor db coll {} {}))
  ([^DB db ^String coll ^Map ref]
   (make-db-cursor db coll ref {}))
  ([^DB db ^String coll ^Map ref fields]
   (.find
    (.getCollection db (name coll))
    (to-db-object ref)
    (as-field-selector fields))))

(def cursor-options {:awaitdata   Bytes/QUERYOPTION_AWAITDATA
                     ;; :exhaust Bytes/QUERYOPTION_EXHAUST is not settable by users
                     :notimeout   Bytes/QUERYOPTION_NOTIMEOUT
                     :oplogreplay Bytes/QUERYOPTION_OPLOGREPLAY
                     :partial     Bytes/QUERYOPTION_PARTIAL
                     :slaveok     Bytes/QUERYOPTION_SLAVEOK
                     :tailable    Bytes/QUERYOPTION_TAILABLE})

(defn get-options
  "Returns a map of the cursor's options with their current state."
  [^DBCursor db-cur]
  (into {}
        (for [[opt option-mask] cursor-options]
          [opt (< 0 (bit-and (.getOptions db-cur) option-mask))])))

(defn add-option!
  [^DBCursor db-cur opt]
  (.addOption db-cur (get cursor-options (keyword opt) 0)))

(defn remove-option!
  [^DBCursor db-cur opt]
  (.setOptions db-cur (bit-and-not (.getOptions db-cur)
                                   (get cursor-options (keyword opt) 0))))

(defmulti add-options
  "Adds the given option(s) to a cursor and returns the cursor."
  (fn [db-cur opts] (class opts)))

(defmethod add-options Map [^DBCursor db-cur options]
  ;; Changes options using a map of settings, where each key names a setting
  ;; and its boolean value specifies the new state of that setting.
  ;; Usage:
  ;;   (add-options db-cur {:notimeout true, :tailable false})
  ;; Returns the ^DBCursor object.
  (doseq [[opt value] (seq options)]
    (if (true? value)
      (add-option! db-cur opt)
      (remove-option! db-cur opt)))
  db-cur)

(defmethod add-options List [^DBCursor db-cur options]
  ;; Takes a list of option names and activates those options.
  ;; Usage:
  ;;   (add-options db-cur [:notimeout :tailable])
  ;; Returns the ^DBCursor object.
  (doseq [opt (seq options)]
    (add-option! db-cur opt))
  db-cur)

(defmethod add-options Integer [^DBCursor db-cur option]
  ;; Takes a com.mongodb.Bytes option value and adds it to the current settings.
  ;; Usage:
  ;;   (add-options db-cur com.mongodb.Bytes/QUERYOPTION_NOTIMEOUT)
  ;; Returns the ^DBCursor object.
  (.addOption db-cur option)
  db-cur)

(defmethod add-options Keyword [^DBCursor db-cur option]
  ;; Takes a single keyword naming a setting and applies it to the cursor.
  ;; Usage:
  ;;   (add-options db-cur :notimeout)
  ;; Returns the ^DBCursor object.
  (add-option! db-cur option)
  db-cur)

(defmethod add-options :default [^DBCursor db-cur options]
  ;; Calling add-options with an unsupported option type returns the cursor unchanged.
  db-cur)

(defn ^DBCursor reset-options
  "Resets cursor options to their default values and returns the cursor"
  [^DBCursor db-cur]
  (.resetOptions db-cur)
  db-cur)

(defmulti format-as (fn [db-cur as] as))

(defmethod format-as :map [db-cur as]
  (map #(from-db-object %1 true) db-cur))

(defmethod format-as :seq [db-cur as]
  (seq db-cur))

(defmethod format-as :default [db-cur as]
  db-cur)

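A hypothetical sketch of the cursor API removed above, tying `make-db-cursor`, `add-options`, `get-options` and `format-as` together. It assumes a running MongoDB server; the database name `monger-test` and collection name `documents` are made up for illustration.

```clojure
;; Hypothetical sketch; assumes a local MongoDB server and the monger.cursor namespace above.
(require '[monger.cursor :refer [make-db-cursor add-options get-options format-as]]
         '[monger.core :as mg])

(let [conn   (mg/connect)
      db     (mg/get-db conn "monger-test")       ;; made-up database name
      cursor (-> (make-db-cursor db "documents")  ;; made-up collection name
                 (add-options [:notimeout :partial]))]
  ;; the options map reflects the flags set above
  (println (get-options cursor))
  ;; realize results as Clojure maps with keywordized keys
  (doseq [doc (format-as cursor :map)]
    (println doc))
  (mg/disconnect conn))
```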
@@ -1,62 +0,0 @@
;; This source code is dual-licensed under the Apache License, version
;; 2.0, and the Eclipse Public License, version 1.0.
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team
;; Copyright (c) 2012 Toby Hede

(ns monger.db
  "Functions that provide operations on databases"
  (:refer-clojure :exclude [find remove count drop distinct empty?])
  (:import [com.mongodb Mongo DB DBCollection])
  (:require monger.core
            [monger.conversion :refer :all]))


;;
;; API
;;

(defn add-user
  "Adds a new user for this db"
  [^DB db ^String username ^chars password]
  (.addUser db username password))


(defn drop-db
  "Drops the given database"
  [^DB db]
  (.dropDatabase db))

(defn get-collection-names
  "Returns a set containing the names of all collections in this database."
  [^DB db]
  (set (.getCollectionNames db)))

@ -1,212 +0,0 @@
|
|||
;; This source code is dual-licensed under the Apache License, version
|
||||
;; 2.0, and the Eclipse Public License, version 1.0.
|
||||
;;
|
||||
;; The APL v2.0:
|
||||
;;
|
||||
;; ----------------------------------------------------------------------------------
|
||||
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team
|
||||
;;
|
||||
;; Licensed under the Apache License, Version 2.0 (the "License");
|
||||
;; you may not use this file except in compliance with the License.
|
||||
;; You may obtain a copy of the License at
|
||||
;;
|
||||
;; http://www.apache.org/licenses/LICENSE-2.0
|
||||
;;
|
||||
;; Unless required by applicable law or agreed to in writing, software
|
||||
;; distributed under the License is distributed on an "AS IS" BASIS,
|
||||
;; WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
;; See the License for the specific language governing permissions and
|
||||
;; limitations under the License.
|
||||
;; ----------------------------------------------------------------------------------
|
||||
;;
|
||||
;; The EPL v1.0:
|
||||
;;
|
||||
;; ----------------------------------------------------------------------------------
|
||||
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team.
|
||||
;; All rights reserved.
|
||||
;;
|
||||
;; This program and the accompanying materials are made available under the terms of
|
||||
;; the Eclipse Public License Version 1.0,
|
||||
;; which accompanies this distribution and is available at
|
||||
;; http://www.eclipse.org/legal/epl-v10.html.
|
||||
;; ----------------------------------------------------------------------------------
|
||||
|
||||
(ns monger.gridfs
|
||||
"Provides functions and macros for working with GridFS: storing files in GridFS, streaming files from GridFS,
|
||||
finding stored files.
|
||||
|
||||
Related documentation guide: http://clojuremongodb.info/articles/gridfs.html"
|
||||
(:refer-clojure :exclude [remove find])
|
||||
(:require monger.core
|
||||
[clojure.java.io :as io]
|
||||
[monger.conversion :refer :all]
|
||||
[clojurewerkz.support.fn :refer [fpartial]])
|
||||
(:import [com.mongodb DB DBObject]
|
||||
org.bson.types.ObjectId
|
||||
[com.mongodb.gridfs GridFS GridFSInputFile]
|
||||
[java.io InputStream ByteArrayInputStream File]))
|
||||
|
||||
;;
|
||||
;; Implementation
|
||||
;;
|
||||
|
||||
(def
|
||||
^{:doc "Type object for a Java primitive byte array."
|
||||
:private true
|
||||
}
|
||||
byte-array-type (class (make-array Byte/TYPE 0)))
|
||||
|
||||
;; ...
|
||||
|
||||
|
||||
|
||||
;;
|
||||
;; API
|
||||
;;
|
||||
|
||||
|
||||
(defn remove
|
||||
[^GridFS fs query]
|
||||
(.remove fs ^DBObject (to-db-object query)))
|
||||
|
||||
(defn remove-all
|
||||
[^GridFS fs]
|
||||
(remove fs {}))
|
||||
|
||||
(defn all-files
|
||||
([^GridFS fs]
|
||||
(.getFileList fs (to-db-object {})))
|
||||
([^GridFS fs query]
|
||||
(.getFileList fs query)))
|
||||
|
||||
(def ^{:private true} converter
|
||||
(fpartial from-db-object true))
|
||||
|
||||
(defn files-as-maps
|
||||
([^GridFS fs]
|
||||
(files-as-maps fs {}))
|
||||
([^GridFS fs query]
|
||||
(map converter (all-files fs (to-db-object query)))))
|
||||
|
||||
|
||||
;;
|
||||
;; Plumbing (low-level API)
|
||||
;;
|
||||
|
||||
(defprotocol InputStreamFactory
|
||||
(^InputStream to-input-stream [input] "Makes InputStream out of the given input"))
|
||||
|
||||
(extend byte-array-type
|
||||
InputStreamFactory
|
||||
{:to-input-stream (fn [^bytes input]
|
||||
(ByteArrayInputStream. input))})
|
||||
|
||||
(extend-protocol InputStreamFactory
|
||||
String
|
||||
(to-input-stream [^String input]
|
||||
(io/make-input-stream input {:encoding "UTF-8"}))
|
||||
|
||||
File
|
||||
(to-input-stream [^File input]
|
||||
(io/make-input-stream input {:encoding "UTF-8"}))
|
||||
|
||||
InputStream
|
||||
(to-input-stream [^InputStream input]
|
||||
input))
|
||||
|
||||
(defprotocol GridFSInputFileFactory
|
||||
(^GridFSInputFile create-gridfs-file [input ^GridFS fs] "Creates a file entry"))
|
||||
|
||||
(extend byte-array-type
|
||||
GridFSInputFileFactory
|
||||
{:create-gridfs-file (fn [^bytes input ^GridFS fs]
|
||||
(.createFile fs input))})
|
||||
|
||||
(extend-protocol GridFSInputFileFactory
|
||||
String
|
||||
(create-gridfs-file [^String input ^GridFS fs]
|
||||
(.createFile fs (io/file input)))
|
||||
|
||||
File
|
||||
(create-gridfs-file [^File input ^GridFS fs]
|
||||
(.createFile fs input))
|
||||
|
||||
InputStream
|
||||
(create-gridfs-file [^InputStream input ^GridFS fs]
|
||||
(.createFile fs input)))
|
||||
|
||||
(defn ^GridFSInputFile make-input-file
|
||||
[^GridFS fs input]
|
||||
(create-gridfs-file input fs))
|
||||
|
||||
(defmacro store
|
||||
[^GridFSInputFile input & body]
|
||||
`(let [^GridFSInputFile f# (doto ~input ~@body)]
|
||||
(.save f# GridFS/DEFAULT_CHUNKSIZE)
|
||||
(from-db-object f# true)))
|
||||
|
||||
;;
|
||||
;; Higher-level API
|
||||
;;
|
||||
|
||||
(defn save
|
||||
[^GridFSInputFile input]
|
||||
(.save input GridFS/DEFAULT_CHUNKSIZE)
|
||||
(from-db-object input true))
|
||||
|
||||
(defn filename
|
||||
[^GridFSInputFile input ^String s]
|
||||
(.setFilename input s)
|
||||
input)
|
||||
|
||||
(defn content-type
|
||||
[^GridFSInputFile input ^String s]
|
||||
(.setContentType input s)
|
||||
input)
|
||||
|
||||
(defn metadata
|
||||
[^GridFSInputFile input md]
|
||||
(.setMetaData input (to-db-object md))
|
||||
input)
|
||||
|
||||
(defmacro store-file
|
||||
[^GridFSInputFile input & body]
|
||||
`(let [f# (-> ~input ~@body)]
|
||||
(save f#)))
|
||||
|
||||
|
||||
;;
|
||||
;; Finders
|
||||
;;
|
||||
|
||||
(defn find
|
||||
[^GridFS fs query]
|
||||
(.find fs (to-db-object query)))
|
||||
|
||||
(defn find-by-filename
|
||||
[^GridFS fs ^String filename]
|
||||
(.find fs (to-db-object {"filename" filename})))
|
||||
|
||||
(defn find-by-md5
|
||||
[^GridFS fs ^String md5]
|
||||
(.find fs (to-db-object {"md5" md5})))
|
||||
|
||||
(defn find-one
|
||||
[^GridFS fs query]
|
||||
(.findOne fs (to-db-object query)))
|
||||
|
||||
(defn find-maps
|
||||
[^GridFS fs query]
|
||||
(map converter (find fs query)))
|
||||
|
||||
(defn find-one-as-map
|
||||
[^GridFS fs query]
|
||||
(converter (find-one fs query)))
|
||||
|
||||
(defn find-by-id
|
||||
[^GridFS fs ^ObjectId id]
|
||||
(.findOne fs id))
|
||||
|
||||
(defn find-map-by-id
|
||||
[^GridFS fs ^ObjectId id]
|
||||
(converter (find-by-id fs id)))
|
||||
|
|
@ -1,39 +0,0 @@
|
|||
;; This source code is dual-licensed under the Apache License, version
|
||||
;; 2.0, and the Eclipse Public License, version 1.0.
|
||||
;;
|
||||
;; The APL v2.0:
|
||||
;;
|
||||
;; ----------------------------------------------------------------------------------
|
||||
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team
|
||||
;;
|
||||
;; Licensed under the Apache License, Version 2.0 (the "License");
|
||||
;; you may not use this file except in compliance with the License.
|
||||
;; You may obtain a copy of the License at
|
||||
;;
|
||||
;; http://www.apache.org/licenses/LICENSE-2.0
|
||||
;;
|
||||
;; Unless required by applicable law or agreed to in writing, software
|
||||
;; distributed under the License is distributed on an "AS IS" BASIS,
|
||||
;; WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
;; See the License for the specific language governing permissions and
|
||||
;; limitations under the License.
|
||||
;; ----------------------------------------------------------------------------------
|
||||
;;
|
||||
;; The EPL v1.0:
|
||||
;;
|
||||
;; ----------------------------------------------------------------------------------
|
||||
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team.
|
||||
;; All rights reserved.
|
||||
;;
|
||||
;; This program and the accompanying materials are made available under the terms of
|
||||
;; the Eclipse Public License Version 1.0,
|
||||
;; which accompanies this distribution and is available at
|
||||
;; http://www.eclipse.org/legal/epl-v10.html.
|
||||
;; ----------------------------------------------------------------------------------
|
||||
|
||||
(ns monger.internal.pagination)
|
||||
|
||||
(defn offset-for
|
||||
[^long page ^long per-page]
|
||||
(* per-page
|
||||
(- (max page 1) 1)))
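The removed `offset-for` helper turns a 1-based page number into a zero-based skip offset, clamping page numbers below 1. The same arithmetic, sketched in Python purely for illustration (the function name mirrors the Clojure var; it is not part of any Monger API):

```python
def offset_for(page: int, per_page: int) -> int:
    """Zero-based skip offset for a 1-based page number.

    Pages below 1 are clamped to 1, so page 0 and page 1
    both map to offset 0, mirroring Monger's offset-for.
    """
    return per_page * (max(page, 1) - 1)


print(offset_for(1, 25))  # 0
print(offset_for(3, 25))  # 50
print(offset_for(0, 25))  # 0
```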
@ -1,83 +0,0 @@
(ns monger.joda-time
  "An optional convenience namespace for applications that heavily use dates and would prefer to use JodaTime types
   transparently when storing and loading them from MongoDB and when serializing to JSON and/or with the Clojure reader.

   Enables automatic conversion of JodaTime date/time/instant instances to JDK dates (java.util.Date) when documents
   are serialized, and the other way around when documents are loaded. Extends the clojure.data.json/Write-JSON protocol
   for JodaTime types.

   To use it, make sure you add dependencies on clj-time (or JodaTime) and clojure.data.json."
  (:import [org.joda.time DateTime DateTimeZone ReadableInstant]
           [org.joda.time.format ISODateTimeFormat])
  (:require [monger.conversion :refer :all]))

;;
;; API
;;

(extend-protocol ConvertToDBObject
  org.joda.time.base.AbstractInstant
  (to-db-object [^AbstractInstant input]
    (to-db-object (.toDate input)))
  org.joda.time.base.AbstractPartial
  (to-db-object [^AbstractPartial input]
    (to-db-object (.toDate input))))

(extend-protocol ConvertFromDBObject
  java.util.Date
  (from-db-object [^java.util.Date input keywordize]
    (org.joda.time.DateTime. input)))

;;
;; Reader extensions
;;

(defmethod print-dup org.joda.time.base.AbstractInstant
  [^org.joda.time.base.AbstractInstant d out]
  (print-dup (.toDate d) out))

(defmethod print-dup org.joda.time.base.AbstractPartial
  [^org.joda.time.base.AbstractPartial d out]
  (print-dup (.toDate d) out))

;;
;; JSON serialization
;;

(require 'clojurewerkz.support.json)

@ -1,50 +0,0 @@
(ns monger.js
  "Kept for backwards compatibility. Please use clojurewerkz.support.js from now on."
  (:require [clojurewerkz.support.js :as js]))

;;
;; API
;;

(defn load-resource
  "Loads a JavaScript resource (a file from the classpath) and returns its content as a string.
   The .js suffix at the end may be omitted.

   Used primarily for map/reduce queries."
  (^String [^String path]
     (js/load-resource path)))

@ -1,116 +0,0 @@
(ns monger.json
  "Provides clojure.data.json/Write-JSON protocol extension for MongoDB-specific types, such as
   org.bson.types.ObjectId"
  (:import org.bson.types.ObjectId
           org.bson.types.BSONTimestamp))

;;
;; Implementation
;;

;; copied from clojure.reducers
(defmacro ^:private compile-if
  "Evaluate `exp` and if it returns logical true and doesn't error, expand to
  `then`. Else expand to `else`.

  (compile-if (Class/forName \"java.util.concurrent.ForkJoinTask\")
    (do-cool-stuff-with-fork-join)
    (fall-back-to-executor-services))"
  [exp then else]
  (if (try (eval exp)
           (catch Throwable _ false))
    `(do ~then)
    `(do ~else)))
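`compile-if` is feature detection at macro-expansion time: the probe expression is evaluated while the macro expands, and only the branch that matches is ever emitted, so code referencing a missing var never gets compiled. A rough runtime analogue of that probe-then-use pattern in Python (names here are illustrative, not part of monger.json):

```python
import importlib


def compile_if(probe, then, else_):
    """Run `probe`; if it returns truthy without raising, use `then`,
    otherwise fall back to `else_`. Mirrors the compile-if macro,
    except the choice happens at call time rather than expansion time."""
    try:
        ok = bool(probe())
    except Exception:
        ok = False
    return then() if ok else else_()


# Probe for a module the way compile-if probes for clojure.data.json vars.
result = compile_if(
    lambda: importlib.import_module("json"),
    lambda: "json available",
    lambda: "json missing",
)
print(result)  # json available
```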
;;
;; API
;;

(require 'clojurewerkz.support.json)

;; all this madness would not be necessary if some people cared about backwards
;; compatibility of the libraries they maintain. Shame on the clojure.data.json maintainer. MK.
(compile-if (and (find-ns 'clojure.data.json)
                 clojure.data.json/JSONWriter)
            (try
              (extend-protocol clojure.data.json/JSONWriter
                ObjectId
                (-write
                  ([^ObjectId object out]
                     (clojure.data.json/write (.toString object) out))
                  ([^ObjectId object out options]
                     (clojure.data.json/write (.toString object) out options))))

              (extend-protocol clojure.data.json/JSONWriter
                BSONTimestamp
                (-write
                  ([^BSONTimestamp object out]
                     (clojure.data.json/write {:time (.getTime object) :inc (.getInc object)} out))
                  ([^BSONTimestamp object out options]
                     (clojure.data.json/write {:time (.getTime object) :inc (.getInc object)} out options))))

              (catch Throwable _
                false))
            (comment "Nothing to do, clojure.data.json is not available"))

(compile-if (and (find-ns 'clojure.data.json)
                 clojure.data.json/Write-JSON)
            (try
              (extend-protocol clojure.data.json/Write-JSON
                ObjectId
                (write-json [^ObjectId object out escape-unicode?]
                  (clojure.data.json/write-json (.toString object) out escape-unicode?)))
              (catch Throwable _
                false))
            (comment "Nothing to do, clojure.data.json 0.1.x is not available"))

(try
  (require 'cheshire.generate)
  (catch Throwable t
    false))

(try
  (cheshire.generate/add-encoder ObjectId
                                 (fn [^ObjectId oid ^com.fasterxml.jackson.core.json.WriterBasedJsonGenerator generator]
                                   (.writeString generator (.toString oid))))
  (cheshire.generate/add-encoder BSONTimestamp
                                 (fn [^BSONTimestamp ts ^com.fasterxml.jackson.core.json.WriterBasedJsonGenerator generator]
                                   (cheshire.generate/encode-map {:time (.getTime ts) :inc (.getInc ts)} generator)))
  (catch Throwable t
    false))

@ -1,459 +0,0 @@
(ns monger.operators
  "Provides vars that represent various MongoDB operators, for example, $gt or $in or $regex.
   They can be passed in queries as strings but using vars from this namespace makes the code
   a bit cleaner and closer to what you would see in a MongoDB shell query.

   Related documentation guide: http://clojuremongodb.info/articles/querying.html")

(defmacro ^{:private true} defoperator
  [operator]
  `(def ^{:const true} ~(symbol (str operator)) ~(str operator)))
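The `defoperator` macro simply interns a constant var whose value is the operator's own name as a string, so `$gt` evaluates to `"$gt"`. A Python sketch of the same idea, for illustration only (`$` is not valid in Python identifiers, so the constants are renamed with an `_op` suffix):

```python
OPERATORS = ["$gt", "$gte", "$lt", "$lte", "$in", "$nin", "$set", "$inc"]

# Intern one module-level constant per operator, each bound to its own
# name as a string -- the moral equivalent of (defoperator $gt).
for _op in OPERATORS:
    globals()[_op.lstrip("$") + "_op"] = _op

print(gt_op)   # $gt
print(set_op)  # $set
```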
|
||||
|
||||
;;
|
||||
;; QUERY OPERATORS
|
||||
;;
|
||||
|
||||
(declare $gt $gte $lt $lte $all $in $nin $eq $ne $elemMatch $regex $options)
|
||||
|
||||
;; $gt is "greater than" comparator
|
||||
;; $gte is "greater than or equals" comparator
|
||||
;; $gt is "less than" comparator
|
||||
;; $lte is "less than or equals" comparator
|
||||
;;
|
||||
;; EXAMPLES:
|
||||
;; (monger.collection/find "libraries" { :users { $gt 10 } })
|
||||
;; (monger.collection/find "libraries" { :users { $gte 10 } })
|
||||
;; (monger.collection/find "libraries" { :users { $lt 10 } })
|
||||
;; (monger.collection/find "libraries" { :users { $lte 10 } })
|
||||
(defoperator $gt)
|
||||
(defoperator $gte)
|
||||
(defoperator $lt)
|
||||
(defoperator $lte)
|
||||
|
||||
;; $all matches all values in the array
|
||||
;;
|
||||
;; EXAMPLES
|
||||
;; (mgcol/find-maps "languages" { :tags { $all [ "functional" "object-oriented" ] } } )
|
||||
(defoperator $all)
|
||||
|
||||
;; The $in operator is analogous to the SQL IN modifier, allowing you to specify an array of possible matches.
|
||||
;;
|
||||
;; EXAMPLES
|
||||
;; (mgcol/find-maps "languages" { :tags { $in [ "functional" "object-oriented" ] } } )
|
||||
(defoperator $in)
|
||||
|
||||
;; The $nin operator is similar to $in, but it selects objects for which the specified field does not
|
||||
;; have any value in the specified array.
|
||||
;;
|
||||
;; EXAMPLES
|
||||
;; (mgcol/find-maps "languages" { :tags { $nin [ "functional" ] } } )
|
||||
(defoperator $nin)
|
||||
|
||||
;; $eq is "equals" comparator
|
||||
;;
|
||||
;; EXAMPLES:
|
||||
;; (monger.collection/find "libraries" { :language { $eq "Clojure" }})
|
||||
(defoperator $eq)
|
||||
|
||||
;; $ne is "non-equals" comparator
|
||||
;;
|
||||
;; EXAMPLES:
|
||||
;; (monger.collection/find "libraries" { :language { $ne "Clojure" }})
|
||||
(defoperator $ne)
|
||||
|
||||
;; $elemMatch checks if an element in an array matches the specified expression
|
||||
;;
|
||||
;; EXAMPLES:
|
||||
;; ;; Matches element with :text "Nice" and :rating greater than or equal 1
|
||||
;; (monger.collection/find "comments" { $elemMatch { :text "Nice!" :rating { $gte 1 } } })
|
||||
(defoperator $elemMatch)
|
||||
|
||||
(defoperator $regex)
|
||||
(defoperator $options)
|
||||
|
||||
;; comment on a query predicate
|
||||
|
||||
(declare $comment $explain $hint $maxTimeMS $orderBy $query $returnKey $showDiskLoc $natural)
|
||||
|
||||
(defoperator $comment)
|
||||
(defoperator $explain)
|
||||
(defoperator $hint)
|
||||
(defoperator $maxTimeMS)
|
||||
(defoperator $orderBy)
|
||||
(defoperator $query)
|
||||
(defoperator $returnKey)
|
||||
(defoperator $showDiskLoc)
|
||||
(defoperator $natural)
|
||||
|
||||
|
||||
;;
|
||||
;; EVALUATION (QUERY)
|
||||
;;
|
||||
|
||||
(declare $expr $jsonSchema $where $and $or $nor)
|
||||
|
||||
(defoperator $expr)
|
||||
(defoperator $jsonSchema)
|
||||
|
||||
;; Matches documents that satisfy a JavaScript expression.
|
||||
;;
|
||||
;; EXAMPLES:
|
||||
;;
|
||||
;; (monger.collection/find "people" { $where "this.placeOfBirth === this.address.city" })
|
||||
(defoperator $where)
|
||||
|
||||
;;
|
||||
;; LOGIC OPERATORS
|
||||
;;
|
||||
|
||||
;; $and lets you use a boolean and in the query. Logical and means that all the given expressions should be true for positive match.
|
||||
;;
|
||||
;; EXAMPLES:
|
||||
;;
|
||||
;; ;; Matches all libraries where :language is "Clojure" and :users is greater than 10
|
||||
;; (monger.collection/find "libraries" { $and [{ :language "Clojure" } { :users { $gt 10 } }] })
|
||||
(defoperator $and)
|
||||
|
||||
;; $or lets you use a boolean or in the query. Logical or means that one of the given expressions should be true for positive match.
|
||||
;;
|
||||
;; EXAMPLES:
|
||||
;;
|
||||
;; ;; Matches all libraries whose :name is "mongoid" or :language is "Ruby"
|
||||
;; (monger.collection.find "libraries" { $or [ { :name "mongoid" } { :language "Ruby" } ] })
|
||||
(defoperator $or)
|
||||
|
||||
;; @nor lets you use a boolean expression, opposite to "all" in the query (think: neither). Give $nor a list of expressions, all of which should
|
||||
;; be false for positive match.
|
||||
;;
|
||||
;; EXAMPLES:
|
||||
;;
|
||||
;; (monger.collection/find "libraries" { $nor [{ :language "Clojure" } {:users { $gt 10 } } ]})
|
||||
(defoperator $nor)
|
||||
|
||||
;;
|
||||
;; ATOMIC MODIFIERS
|
||||
;;
|
||||
|
||||
(declare $inc $mul $set $unset $setOnInsert $rename $push $position $each $addToSet $pop $pull $pullAll $bit $bitsAllClear $bitsAllSet $bitsAnyClear $bitsAnySet $exists $mod $size $type $not)
|
||||
|
||||
;; $inc increments one or many fields for the given value, otherwise sets the field to value
|
||||
;;
|
||||
;; EXAMPLES:
|
||||
;; (monger.collection/update "scores" { :_id user-id } { :score 10 } })
|
||||
;; (monger.collection/update "scores" { :_id user-id } { :score 20 :bonus 10 } })
|
||||
(defoperator $inc)
|
||||
|
||||
(defoperator $mul)
|
||||
|
||||
;; $set sets an existing (or non-existing) field (or set of fields) to value
|
||||
;; $set supports all datatypes.
|
||||
;;
|
||||
;; EXAMPLES:
|
||||
;; (monger.collection/update "things" { :_id oid } { $set { :weight 20.5 } })
|
||||
;; (monger.collection/update "things" { :_id oid } { $set { :weight 20.5 :height 12.5 } })
|
||||
(defoperator $set)
|
||||
|
||||
;; $unset deletes a given field, non-existing fields are ignored.
|
||||
;;
|
||||
;; EXAMPLES:
|
||||
;; (monger.collection/update "things" { :_id oid } { $unset { :weight 1 } })
|
||||
(defoperator $unset)
|
||||
|
||||
;; $setOnInsert assigns values to fields during an upsert only when using the upsert option to the update operation performs an insert.
|
||||
;; New in version 2.4. http://docs.mongodb.org/manual/reference/operator/setOnInsert/
|
||||
;;
|
||||
;; EXAMPLES:
|
||||
;; (monger.collection/find-and-modify "things" {:_id oid} {$set {:lastseen now} $setOnInsert {:firstseen now}} :upsert true)
|
||||
(defoperator $setOnInsert)
|
||||
|
||||
;; $rename renames a given field
|
||||
;;
|
||||
;; EXAMPLES:
|
||||
;; (monger.collection/update "things" { :_id oid } { $rename { :old_field_name "new_field_name" } })
|
||||
(defoperator $rename)
|
||||
|
||||
;; $push appends _single_ value to field, if field is an existing array, otherwise sets field to the array [value] if field is not present.
|
||||
;; If field is present but is not an array, an error condition is raised.
|
||||
;;
|
||||
;; EXAMPLES:
|
||||
;; (mgcol/update "docs" { :_id oid } { $push { :tags "modifiers" } })
|
||||
(defoperator $push)
|
||||
|
||||
;; $position modifies the behavior of $push per https://docs.mongodb.com/manual/reference/operator/update/position/
|
||||
(defoperator $position)
|
||||
|
||||
;; $each is a modifier for the $push and $addToSet operators for appending multiple values to an array field.
|
||||
;; Without the $each modifier $push and $addToSet will append an array as a single value.
|
||||
;; MongoDB 2.4 adds support for the $each modifier to the $push operator.
|
||||
;; In MongoDB 2.2 the $each modifier can only be used with the $addToSet operator.
|
||||
;;
|
||||
;; EXAMPLES:
|
||||
;; (mgcol/update coll { :_id oid } { $push { :tags { $each ["mongodb" "docs"] } } })
|
||||
(defoperator $each)
|
||||
|
||||
;; $addToSet Adds value to the array only if its not in the array already, if field is an existing array, otherwise sets field to the
|
||||
;; array value if field is not present. If field is present but is not an array, an error condition is raised.
|
||||
;;
|
||||
;; EXAMPLES:
|
||||
;; (mgcol/update coll { :_id oid } { $addToSet { :tags "modifiers" } })
|
||||
(defoperator $addToSet)
|
||||
|
||||
;; $pop removes the last element in an array, if 1 is passed.
|
||||
;; if -1 is passed, removes the first element in an array
|
||||
;;
|
||||
;; EXAMPLES:
|
||||
;; (mgcol/update coll { :_id oid } { $pop { :tags 1 } })
|
||||
;; (mgcol/update coll { :_id oid } { $pop { :tags 1 :categories 1 } })
|
||||
(defoperator $pop)
|
||||
|
||||
;; $pull removes all occurrences of value from field, if field is an array. If field is present but is not an array, an error condition
|
||||
;; is raised.
|
||||
;;
|
||||
;; EXAMPLES:
|
||||
;; (mgcol/update coll { :_id oid } { $pull { :measurements 1.2 } })
|
||||
(defoperator $pull)
|
||||
|
||||
;; $pullAll removes all occurrences of each value in value_array from field, if field is an array. If field is present but is not an array
|
||||
;; an error condition is raised.
|
||||
;;
|
||||
;; EXAMPLES:
|
||||
;; (mgcol/update coll { :_id oid } { $pullAll { :measurements 1.2 } })
|
||||
;; (mgcol/update coll { :_id oid } { $pullAll { :measurements { $gte 1.2 } } })
|
||||
(defoperator $pullAll)
|
||||
|
||||
(defoperator $bit)
|
||||
(defoperator $bitsAllClear)
|
||||
(defoperator $bitsAllSet)
|
||||
(defoperator $bitsAnyClear)
|
||||
(defoperator $bitsAnySet)
|
||||
|
||||
(defoperator $exists)
|
||||
(defoperator $mod)
|
||||
(defoperator $size)
|
||||
(defoperator $type)
|
||||
(defoperator $not)
|
||||
|
||||
|
||||
;;
;; Aggregation in 4.2
;;

(declare $addFields $bucket $bucketAuto $collStats $facet $geoNear $graphLookup $indexStats $listSessions $lookup $match $merge $out $planCacheStats $project $redact $replaceRoot $replaceWith $sample $limit $skip $unwind $group $sort $sortByCount $currentOp $listLocalSessions $cmp $min $max $avg $stdDevPop $stdDevSamp $sum $let $first $last $abs $add $ceil $divide $exp $floor $ln $log $log10 $multiply $pow $round $sqrt $subtract $trunc $literal $arrayElemAt $arrayToObject $concatArrays $filter $indexOfArray $isArray $map $objectToArray $range $reduce $reverseArray $zip $mergeObjects $allElementsTrue $anyElementsTrue $setDifference $setEquals $setIntersection $setIsSubset $setUnion $strcasecmp $substr $substrBytes $substrCP $toLower $toString $toUpper $concat $indexOfBytes $indexOfCP $ltrim $regexFind $regexFindAll $regexMatch $rtrim $split $strLenBytes $subLenCP $trim $sin $cos $tan $asin $acos $atan $atan2 $asinh $acosh $atanh $radiansToDegrees $degreesToRadians $convert $toBool $toDecimal $toDouble $toInt $toLong $toObjectId $dayOfMonth $dayOfWeek $dayOfYear $hour $minute $month $second $millisecond $week $year $isoDate $dateFromParts $dateFromString $dateToParts $dateToString $isoDayOfWeek $isoWeek $isoWeekYear $toDate $ifNull $cond $switch)

(defoperator $addFields)
(defoperator $bucket)
(defoperator $bucketAuto)
(defoperator $collStats)
(defoperator $facet)
(defoperator $geoNear)
(defoperator $graphLookup)
(defoperator $indexStats)
(defoperator $listSessions)
(defoperator $lookup)
(defoperator $match)
(defoperator $merge)
(defoperator $out)
(defoperator $planCacheStats)
(defoperator $project)
(defoperator $redact)
(defoperator $replaceRoot)
(defoperator $replaceWith)
(defoperator $sample)
(defoperator $limit)
(defoperator $skip)
(defoperator $unwind)
(defoperator $group)
(defoperator $sort)
(defoperator $sortByCount)

(defoperator $currentOp)
(defoperator $listLocalSessions)

(defoperator $cmp)

(defoperator $min)
(defoperator $max)
(defoperator $avg)
(defoperator $stdDevPop)
(defoperator $stdDevSamp)
(defoperator $sum)
(defoperator $let)

(defoperator $first)
(defoperator $last)

(defoperator $abs)
(defoperator $add)
(defoperator $ceil)
(defoperator $divide)
(defoperator $exp)
(defoperator $floor)
(defoperator $ln)
(defoperator $log)
(defoperator $log10)
(defoperator $multiply)
(defoperator $pow)
(defoperator $round)
(defoperator $sqrt)
(defoperator $subtract)
(defoperator $trunc)
(defoperator $literal)

(defoperator $arrayElemAt)
(defoperator $arrayToObject)
(defoperator $concatArrays)
(defoperator $filter)
(defoperator $indexOfArray)
(defoperator $isArray)
(defoperator $map)
(defoperator $objectToArray)
(defoperator $range)
(defoperator $reduce)
(defoperator $reverseArray)
(defoperator $zip)
(defoperator $mergeObjects)

(defoperator $allElementsTrue)
(defoperator $anyElementsTrue)
(defoperator $setDifference)
(defoperator $setEquals)
(defoperator $setIntersection)
(defoperator $setIsSubset)
(defoperator $setUnion)

(defoperator $strcasecmp)
(defoperator $substr)
(defoperator $substrBytes)
(defoperator $substrCP)
(defoperator $toLower)
(defoperator $toString)
(defoperator $toUpper)
(defoperator $concat)
(defoperator $indexOfBytes)
(defoperator $indexOfCP)
(defoperator $ltrim)
(defoperator $regexFind)
(defoperator $regexFindAll)
(defoperator $regexMatch)
(defoperator $rtrim)
(defoperator $split)
(defoperator $strLenBytes)
(defoperator $subLenCP)
(defoperator $trim)

(defoperator $sin)
(defoperator $cos)
(defoperator $tan)
(defoperator $asin)
(defoperator $acos)
(defoperator $atan)
(defoperator $atan2)
(defoperator $asinh)
(defoperator $acosh)
(defoperator $atanh)
(defoperator $radiansToDegrees)
(defoperator $degreesToRadians)

(defoperator $convert)
(defoperator $toBool)
(defoperator $toDecimal)
(defoperator $toDouble)
(defoperator $toInt)
(defoperator $toLong)
(defoperator $toObjectId)

(defoperator $dayOfMonth)
(defoperator $dayOfWeek)
(defoperator $dayOfYear)
(defoperator $hour)
(defoperator $minute)
(defoperator $month)
(defoperator $second)
(defoperator $millisecond)
(defoperator $week)
(defoperator $year)
(defoperator $isoDate)
(defoperator $dateFromParts)
(defoperator $dateFromString)
(defoperator $dateToParts)
(defoperator $dateToString)
(defoperator $isoDayOfWeek)
(defoperator $isoWeek)
(defoperator $isoWeekYear)
(defoperator $toDate)

(defoperator $ifNull)
(defoperator $cond)
(defoperator $switch)

;; Geospatial
(declare $geoWithin $geoIntersects $near $nearSphere $geometry $maxDistance $minDistance $center $centerSphere $box $polygon $slice)
(defoperator $geoWithin)
(defoperator $geoIntersects)
(defoperator $near)
(defoperator $nearSphere)
(defoperator $geometry)
(defoperator $maxDistance)
(defoperator $minDistance)
(defoperator $center)
(defoperator $centerSphere)
(defoperator $box)
(defoperator $polygon)

(defoperator $slice)

;; full text search
(declare $text $meta $search $language $natural $currentDate $isolated $count)
(defoperator $text)
(defoperator $meta)
(defoperator $search)
(defoperator $language)
(defoperator $natural)

;; $currentDate operator sets the value of a field to the current date, either as a Date or a timestamp. The default type is Date.
;;
;; EXAMPLES:
;; (mgcol/update coll { :_id oid } { $currentDate { :lastModified true } })
(defoperator $currentDate)

;; Isolates intermediate multi-document updates from other clients.
;;
;; EXAMPLES:
;; (mgcol/update "libraries" { :language "Clojure", $isolated 1 } { $inc { :popularity 1 } } {:multi true})
(defoperator $isolated)

(defoperator $count)

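The operator vars above expand to their `$`-prefixed string names, so they are meant to be used directly as keys in query and aggregation maps. A minimal pipeline sketch, assuming a connected `db` value and a "checkins" collection that are not part of this changeset:

```clojure
;; Group completed check-ins by venue and rank venues by volume.
;; `db` and the "checkins" collection are illustrative assumptions.
(require '[monger.collection :as mc])
(require '[monger.operators :refer [$match $group $sort $sum]])

(mc/aggregate db "checkins"
              [{$match {:status "done"}}
               {$group {:_id :venue :total {$sum 1}}}
               {$sort  {:total -1}}])
```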
@@ -1,189 +0,0 @@
;; This source code is dual-licensed under the Apache License, version
;; 2.0, and the Eclipse Public License, version 1.0.
;;
;; The APL v2.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team
;;
;; Licensed under the Apache License, Version 2.0 (the "License");
;; you may not use this file except in compliance with the License.
;; You may obtain a copy of the License at
;;
;; http://www.apache.org/licenses/LICENSE-2.0
;;
;; Unless required by applicable law or agreed to in writing, software
;; distributed under the License is distributed on an "AS IS" BASIS,
;; WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
;; See the License for the specific language governing permissions and
;; limitations under the License.
;; ----------------------------------------------------------------------------------
;;
;; The EPL v1.0:
;;
;; ----------------------------------------------------------------------------------
;; Copyright (c) 2011-2018 Michael S. Klishin, Alex Petrov, and the ClojureWerkz Team.
;; All rights reserved.
;;
;; This program and the accompanying materials are made available under the terms of
;; the Eclipse Public License Version 1.0,
;; which accompanies this distribution and is available at
;; http://www.eclipse.org/legal/epl-v10.html.
;; ----------------------------------------------------------------------------------

(ns monger.query
  "Provides an expressive Query DSL that is very close to that in the Mongo shell (within reason).
   This is the most flexible and recommended way to query with Monger. Queries can be composed, like in Korma.

   Related documentation guide: http://clojuremongodb.info/articles/querying.html"
  (:refer-clojure :exclude [select find sort])
  (:require [monger.core]
            [monger.internal pagination]
            [monger.cursor :as cursor :refer [add-options]]
            [monger.conversion :refer :all]
            [monger.operators :refer :all])
  (:import [com.mongodb DB DBCollection DBObject DBCursor ReadPreference]
           [java.util.concurrent TimeUnit]
           java.util.List))

;;
;; Implementation
;;

;;
;; Cursor/chain methods
;;
;; A Monger query is an auxiliary construction that helps to create function chains through cursors.
;; You can specify several chained actions that will be performed on a given collection through
;; query fields.
;;
;; Existing query fields:
;;
;; :fields     - selects which fields are returned during query execution. The default is all fields; _id is always included.
;; :sort       - adds a sort to the query.
;; :skip       - skips the first N results.
;; :limit      - returns a maximum of N results.
;; :batch-size - limits the number of elements returned in one batch.
;; :snapshot   - uses snapshot mode for the query. Snapshot mode assures no duplicates are returned, or objects missed
;;               which were present at both the start and end of the query's execution (if an object is new during the query, or
;;               deleted during the query, it may or may not be returned, even with snapshot mode). Note that short query responses
;;               (less than 1MB) are always effectively snapshotted. Currently, snapshot mode may not be used with sorting or explicit hints.
(defn empty-query
  ([]
   {:query             {}
    :sort              {}
    :fields            []
    :skip              0
    :limit             0
    :batch-size        256
    :snapshot          false
    :keywordize-fields true})
  ([^DBCollection coll]
   (merge (empty-query) {:collection coll})))

(defn exec
  [{:keys [^DBCollection collection
           query
           fields
           skip
           limit
           sort
           batch-size
           hint
           snapshot
           read-preference
           keywordize-fields
           max-time
           options]
    :or {limit 0 batch-size 256 skip 0}}]
  (with-open [cursor (doto (.find collection (to-db-object query) (as-field-selector fields))
                       (.limit limit)
                       (.skip skip)
                       (.sort (to-db-object sort))
                       (.batchSize batch-size))]
    (when snapshot
      (.snapshot cursor))
    (when hint
      (.hint cursor (to-db-object hint)))
    (when read-preference
      (.setReadPreference cursor read-preference))
    (when max-time
      (.maxTime cursor max-time TimeUnit/MILLISECONDS))
    (when options
      (add-options cursor options))
    (map (fn [x] (from-db-object x keywordize-fields))
         cursor)))

;;
;; API
;;

(defn find
  [m query]
  (merge m {:query query}))

(defn fields
  [m flds]
  (merge m {:fields flds}))

(defn sort
  [m srt]
  (merge m {:sort srt}))

(defn skip
  [m ^long n]
  (merge m {:skip n}))

(defn limit
  [m ^long n]
  (merge m {:limit n}))

(defn batch-size
  [m ^long n]
  (merge m {:batch-size n}))

(defn hint
  [m h]
  (merge m {:hint h}))

(defn snapshot
  [m]
  (merge m {:snapshot true}))

(defn read-preference
  [m ^ReadPreference rp]
  (merge m {:read-preference rp}))

(defn max-time
  [m ^long max-time]
  (merge m {:max-time max-time}))

(defn options
  [m opts]
  (merge m {:options opts}))

(defn keywordize-fields
  [m bool]
  (merge m {:keywordize-fields bool}))

(defn paginate
  [m & {:keys [page per-page] :or {page 1 per-page 10}}]
  (merge m {:limit per-page :skip (monger.internal.pagination/offset-for page per-page)}))

(defmacro with-collection
  [db coll & body]
  `(let [coll#    ~coll
         ^DB db#  ~db
         db-coll# (if (string? coll#)
                    (.getCollection db# coll#)
                    coll#)
         query#   (-> (empty-query db-coll#) ~@body)]
     (exec query#)))

(defmacro partial-query
  [& body]
  `(-> {} ~@body))

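The DSL above builds up a plain query map step by step and only touches the database inside `with-collection`. A hedged usage sketch, assuming a connected `db` value and a "scores" collection that are not part of this changeset:

```clojure
(require '[monger.query :refer [with-collection find fields sort paginate]])

;; Highest hangman scores, second page of ten.
;; `db` and the "scores" collection are illustrative assumptions.
(with-collection db "scores"
  (find {:game "hangman"})
  (fields [:score :player])
  (sort {:score -1})
  (paginate :page 2 :per-page 10))
```

Because each step just merges into a map, partial queries built with `partial-query` can be threaded into `with-collection` later, which is what makes the queries composable.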
@@ -1,65 +0,0 @@
(ns monger.ragtime
  "Ragtime integration"
  (:refer-clojure :exclude [find sort])
  (:require [ragtime.protocols :as proto]
            [monger.core :as mg]
            [monger.collection :as mc]
            [monger.query :refer [with-collection find sort]])
  (:import java.util.Date
           [com.mongodb DB WriteConcern]))

(def ^{:const true}
  migrations-collection "meta.migrations")

(extend-type com.mongodb.DB
  proto/DataStore
  (add-migration-id [db id]
    (mc/insert db migrations-collection {:_id id :created_at (Date.)} WriteConcern/FSYNC_SAFE))
  (remove-migration-id [db id]
    (mc/remove-by-id db migrations-collection id))
  (applied-migration-ids [db]
    (let [xs (with-collection db migrations-collection
               (find {})
               (sort {:created_at 1}))]
      (vec (map :_id xs)))))

(defn flush-migrations!
  "REMOVES all the information about previously performed migrations"
  [^DB db]
  (mc/remove db migrations-collection))

@@ -1,72 +0,0 @@
(ns monger.result
  "Provides functions that determine if a query (or other database operation)
   was successful or not.

   Related documentation guides:

   * http://clojuremongodb.info/articles/inserting.html
   * http://clojuremongodb.info/articles/updating.html
   * http://clojuremongodb.info/articles/commands.html
   * http://clojuremongodb.info/articles/mapreduce.html"
  (:import [com.mongodb WriteResult CommandResult])
  (:require monger.conversion))

;;
;; API
;;

(defprotocol WriteResultPredicates
  (acknowledged? [input] "Returns true if write result is a success")
  (updated-existing? [input] "Returns true if write result has updated an existing document"))

(extend-protocol WriteResultPredicates
  WriteResult
  (acknowledged?
    [^WriteResult result]
    (.wasAcknowledged result))
  (updated-existing?
    [^WriteResult result]
    (.isUpdateOfExisting result))

  CommandResult
  (acknowledged?
    [^CommandResult result]
    (.ok result)))

(defn affected-count
  "Get the number of documents affected"
  [^WriteResult result]
  (.getN result))

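The predicates above operate on the driver's `WriteResult` and `CommandResult` values that write operations return. A minimal sketch, assuming a connected `db` value and a "documents" collection that are not part of this changeset:

```clojure
(require '[monger.collection :as mc]
         '[monger.result :refer [acknowledged? affected-count]])

;; `db` and the "documents" collection are illustrative assumptions.
(let [res (mc/insert db "documents" {:name "monger"})]
  (when (acknowledged? res)
    (affected-count res)))
```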
@@ -1,135 +0,0 @@
(ns monger.ring.session-store
  (:require [ring.middleware.session.store :as ringstore]
            [monger.collection :as mc]
            [monger.core :as mg]
            [monger.conversion :refer :all])
  (:import [java.util UUID Date]
           [com.mongodb DB]
           ring.middleware.session.store.SessionStore))

;;
;; Implementation
;;

(def ^{:const true}
  default-session-store-collection "web_sessions")

;;
;; API
;;

;; This session store stores Clojure data structures using the Clojure reader. It will correctly store every
;; data structure the Clojure reader can serialize and read, but won't make the data useful to applications
;; in other languages.

(defrecord ClojureReaderBasedMongoDBSessionStore [^DB db ^String collection-name])

(defmethod print-dup java.util.Date
  [^java.util.Date d ^java.io.Writer out]
  (.write out
          (str "#="
               `(java.util.Date. ~(.getYear d)
                                 ~(.getMonth d)
                                 ~(.getDate d)
                                 ~(.getHours d)
                                 ~(.getMinutes d)
                                 ~(.getSeconds d)))))

(defmethod print-dup org.bson.types.ObjectId
  [oid ^java.io.Writer out]
  (.write out
          (str "#="
               `(org.bson.types.ObjectId. ~(str oid)))))

(extend-protocol ringstore/SessionStore
  ClojureReaderBasedMongoDBSessionStore

  (read-session [store key]
    (if key
      (if-let [m (mc/find-one-as-map (.db store) (.collection-name store) {:_id key})]
        (read-string (:value m))
        {})
      {}))

  (write-session [store key data]
    (let [date  (Date.)
          key   (or key (str (UUID/randomUUID)))
          value (binding [*print-dup* true]
                  (pr-str (assoc data :_id key)))]
      (mc/save (.db store) (.collection-name store) {:_id key :value value :date date})
      key))

  (delete-session [store key]
    (mc/remove-by-id (.db store) (.collection-name store) key)
    nil))

(defn session-store
  [^DB db ^String s]
  (ClojureReaderBasedMongoDBSessionStore. db s))

;; This session store won't store namespaced keywords correctly but stores results in a way
;; that applications in other languages can read. DO NOT use it with Friend.

(defrecord MongoDBSessionStore [^DB db ^String collection-name])

(extend-protocol ringstore/SessionStore
  MongoDBSessionStore

  (read-session [store key]
    (if-let [m (and key
                    (mc/find-one-as-map (.db store) (.collection-name store) {:_id key}))]
      m
      {}))

  (write-session [store key data]
    (let [key (or key (str (UUID/randomUUID)))]
      (mc/save (.db store) (.collection-name store) (assoc data :date (Date.) :_id key))
      key))

  (delete-session [store key]
    (mc/remove-by-id (.db store) (.collection-name store) key)
    nil))

(defn monger-store
  [^DB db ^String s]
  (MongoDBSessionStore. db s))

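Either store plugs into Ring's standard session middleware through the `:store` option. A sketch, assuming a Ring handler `app-routes` and a connected `db` value that are not part of this changeset:

```clojure
(require '[ring.middleware.session :refer [wrap-session]]
         '[monger.ring.session-store :refer [session-store]])

;; `app-routes` and `db` are illustrative assumptions.
(def app
  (wrap-session app-routes
                {:store (session-store db "web_sessions")}))
```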
@@ -1,82 +0,0 @@
(ns ^{:doc "Provides various utility functions, primarily for working with document ids."} monger.util
  (:refer-clojure :exclude [random-uuid])
  (:import java.security.SecureRandom
           java.math.BigInteger
           org.bson.types.ObjectId
           com.mongodb.DBObject
           clojure.lang.IPersistentMap
           java.util.Map))

;;
;; API
;;

(defn ^String random-uuid
  "Generates a secure random UUID string"
  []
  (.toString (java.util.UUID/randomUUID)))

(defn ^String random-str
  "Generates a secure random string"
  [^long n, ^long num-base]
  (.toString (new BigInteger n (SecureRandom.)) num-base))

(defn ^ObjectId object-id
  "Returns a new BSON object id, or converts str to BSON object id"
  ([]
   (ObjectId.))
  ([^String s]
   (ObjectId. s)))

(defprotocol GetDocumentId
  (get-id [input] "Returns document id"))

(extend-protocol GetDocumentId
  DBObject
  (get-id
    [^DBObject object]
    (.get object "_id"))

  IPersistentMap
  (get-id
    [^IPersistentMap object]
    (or (:_id object) (object "_id"))))

(defn into-array-list
  "Coerce a j.u.Collection into a j.u.ArrayList."
  ^java.util.ArrayList [^java.util.Collection coll]
  (java.util.ArrayList. coll))

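The id helpers above are thin, pure wrappers over the Java driver types and can be exercised without a database. A small sketch:

```clojure
(require '[monger.util :refer [object-id get-id random-str]])

;; get-id works on both Clojure maps and DBObjects
(let [oid (object-id)]
  (get-id {:_id oid}))

;; a secure random string: n random bits rendered in the given base,
;; here up to 160 bits in base 16
(random-str 160 16)
```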
545 src/monger/collection.clj Normal file

@@ -0,0 +1,545 @@
|
|||
;; Copyright (c) 2011-2012 Michael S. Klishin
|
||||
;; Copyright (c) 2012 Toby Hede
|
||||
;;
|
||||
;; The use and distribution terms for this software are covered by the
|
||||
;; Eclipse Public License 1.0 (http://opensource.org/licenses/eclipse-1.0.php)
|
||||
;; which can be found in the file epl-v10.html at the root of this distribution.
|
||||
;; By using this software in any fashion, you are agreeing to be bound by
|
||||
;; the terms of this license.
|
||||
;; You must not remove this notice, or any other, from this software.
|
||||
|
||||
(ns monger.collection
|
||||
(:refer-clojure :exclude [find remove count drop distinct empty?])
|
||||
(:import [com.mongodb Mongo DB DBCollection WriteResult DBObject WriteConcern DBCursor MapReduceCommand MapReduceCommand$OutputType]
|
||||
[java.util List Map]
|
||||
[clojure.lang IPersistentMap ISeq]
|
||||
[org.bson.types ObjectId])
|
||||
(:require [monger core result])
|
||||
(:use [monger.conversion]))
|
||||
|
||||
;;
|
||||
;; Implementation
|
||||
;;
|
||||
|
||||
(defn- fields-to-db-object
|
||||
[^List fields]
|
||||
(zipmap fields (repeat 1)))
|
||||
|
||||
(definline check-not-nil!
|
||||
[ref ^String message]
|
||||
`(when (nil? ~ref)
|
||||
(throw (IllegalArgumentException. ~message))))
|
||||
|
||||
|
||||
;;
|
||||
;; API
|
||||
;;
|
||||
|
||||
;;
|
||||
;; monger.collection/insert
|
||||
;;
|
||||
|
||||
(defn ^WriteResult insert
|
||||
"Saves @document@ to @collection@. You can optionally specify WriteConcern.
|
||||
|
||||
EXAMPLES:
|
||||
|
||||
(monger.collection/insert \"people\" { :name \"Joe\", :age 30 })
|
||||
|
||||
(monger.collection/insert \"people\" { :name \"Joe\", :age 30, WriteConcern/SAFE })
|
||||
"
|
||||
([^String collection ^DBObject document]
|
||||
(let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)]
|
||||
(.insert ^DBCollection coll ^DBObject (to-db-object document) ^WriteConcern monger.core/*mongodb-write-concern*)))
|
||||
([^String collection ^DBObject document ^WriteConcern concern]
|
||||
(let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)]
|
||||
(.insert ^DBCollection coll ^DBObject (to-db-object document) ^WriteConcern concern)))
|
||||
([^DB db ^String collection ^DBObject document ^WriteConcern concern]
|
||||
(let [^DBCollection coll (.getCollection db collection)]
|
||||
(.insert ^DBCollection coll ^DBObject (to-db-object document) ^WriteConcern concern))))
|
||||
|
||||
|
||||
(defn ^WriteResult insert-batch
|
||||
"Saves @documents@ do @collection@. You can optionally specify WriteConcern as a third argument.
|
||||
|
||||
EXAMPLES:
|
||||
|
||||
(monger.collection/insert-batch \"people\" [{ :name \"Joe\", :age 30 }, { :name \"Paul\", :age 27 }])
|
||||
|
||||
(monger.collection/insert-batch \"people\" [{ :name \"Joe\", :age 30 }, { :name \"Paul\", :age 27 }] WriteConcern/NORMAL)
|
||||
|
||||
"
|
||||
([^String collection, ^List documents]
|
||||
(let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)]
|
||||
(.insert ^DBCollection coll ^List (to-db-object documents) ^WriteConcern monger.core/*mongodb-write-concern*)))
|
||||
([^String collection, ^List documents, ^WriteConcern concern]
|
||||
(let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)]
|
||||
(.insert ^DBCollection coll ^List (to-db-object documents) ^WriteConcern concern)))
|
||||
([^DB db ^String collection, ^List documents, ^WriteConcern concern]
|
||||
(let [^DBCollection coll (.getCollection db collection)]
|
||||
(.insert ^DBCollection coll ^List (to-db-object documents) ^WriteConcern concern))))
|
||||
|
||||
;;
|
||||
;; monger.collection/find
|
||||
;;
|
||||
|
||||
(defn ^DBCursor find
|
||||
"Queries for objects in this collection.
|
||||
This function returns DBCursor, which allows you to iterate over DBObjects.
|
||||
If you want to manipulate clojure sequences maps, please @find-maps@.
|
||||
|
||||
EXAMPLES:
|
||||
;; return all objects in this collection.
|
||||
(mgcol/find \"people\")
|
||||
|
||||
;; return all objects matching query
|
||||
(mgcol/find \"people\" { :company \"Comp Corp\"})
|
||||
|
||||
;; return all objects matching query, taking only specified fields
|
||||
(mgcol/find \"people\" { :company \"Comp Corp\"} [:first_name :last_name])
|
||||
"
|
||||
([^String collection]
|
||||
(let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)]
|
||||
(.find coll)))
|
||||
([^String collection ^Map ref]
|
||||
(let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)]
|
||||
(.find ^DBCollection coll ^DBObject (to-db-object ref))))
|
||||
([^String collection ^Map ref ^List fields]
|
||||
(let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)
|
||||
map-of-fields (fields-to-db-object fields)]
|
||||
(.find ^DBCollection coll ^DBObject (to-db-object ref) ^DBObject (to-db-object map-of-fields))))
|
||||
([^DB db ^String collection ^Map ref ^List fields]
|
||||
(let [^DBCollection coll (.getCollection db collection)
|
||||
map-of-fields (fields-to-db-object fields)]
|
||||
(.find ^DBCollection coll ^DBObject (to-db-object ref) ^DBObject (to-db-object map-of-fields)))))
|
||||
|
||||
(defn ^ISeq find-maps
  "Queries for objects in this collection.
   This function returns a Clojure seq of maps.
   If you want to work directly with DBObject, use find."
  ([^String collection]
     (map (fn [x] (from-db-object x true)) (find collection)))
  ([^String collection ^Map ref]
     (map (fn [x] (from-db-object x true)) (find collection ref)))
  ([^String collection ^Map ref ^List fields]
     (map (fn [x] (from-db-object x true)) (find collection ref fields)))
  ([^DB db ^String collection ^Map ref ^List fields]
     (map (fn [x] (from-db-object x true)) (find db collection ref fields))))


(defn ^ISeq find-seq
  "Queries for objects in this collection, returns ISeq of DBObjects."
  ([^String collection]
     (seq (find collection)))
  ([^String collection ^Map ref]
     (seq (find collection ref)))
  ([^String collection ^Map ref ^List fields]
     (seq (find collection ref fields)))
  ([^DB db ^String collection ^Map ref ^List fields]
     (seq (find db collection ref fields))))


;;
;; monger.collection/find-one
;;

(defn ^DBObject find-one
  "Returns a single DBObject from this collection matching the query.

   EXAMPLES:

   (mgcol/find-one collection { :language \"Clojure\" })

   ;; Return only the :language field.
   ;; Note that the _id field is always returned.
   (mgcol/find-one collection { :language \"Clojure\" } [:language])"
  ([^String collection ^Map ref]
     (let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)]
       (.findOne ^DBCollection coll ^DBObject (to-db-object ref))))
  ([^String collection ^Map ref ^List fields]
     (let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)
           map-of-fields      (fields-to-db-object fields)]
       (.findOne ^DBCollection coll ^DBObject (to-db-object ref) ^DBObject (to-db-object map-of-fields))))
  ([^DB db ^String collection ^Map ref ^List fields]
     (let [^DBCollection coll (.getCollection db collection)
           map-of-fields      (fields-to-db-object fields)]
       (.findOne ^DBCollection coll ^DBObject (to-db-object ref) ^DBObject (to-db-object map-of-fields)))))


(defn ^IPersistentMap find-one-as-map
  "Returns a single object converted to Map from this collection matching the query."
  ([^String collection ^Map ref]
     (from-db-object ^DBObject (find-one collection ref) true))
  ([^String collection ^Map ref ^List fields]
     (from-db-object ^DBObject (find-one collection ref fields) true))
  ([^String collection ^Map ref ^List fields keywordize]
     (from-db-object ^DBObject (find-one collection ref fields) keywordize)))


;;
;; monger.collection/find-by-id
;;

(defn ^DBObject find-by-id
  "Returns a single object with matching _id field.

   EXAMPLES:

   (mgcol/find-by-id collection (ObjectId. \"4ef45ab4744e9fd632640e2d\"))

   ;; Return only the :language field.
   ;; Note that the _id field is always returned.
   (mgcol/find-by-id collection (ObjectId. \"4ef45ab4744e9fd632640e2d\") [:language])"
  ([^String collection id]
     (check-not-nil! id "id must not be nil")
     (find-one collection { :_id id }))
  ([^String collection id ^List fields]
     (check-not-nil! id "id must not be nil")
     (find-one collection { :_id id } fields))
  ([^DB db ^String collection id ^List fields]
     (check-not-nil! id "id must not be nil")
     (find-one db collection { :_id id } fields)))


(defn ^IPersistentMap find-map-by-id
  "Returns a single object, converted to map, with matching _id field."
  ([^String collection id]
     (check-not-nil! id "id must not be nil")
     (find-one-as-map collection { :_id id }))
  ([^String collection id ^List fields]
     (check-not-nil! id "id must not be nil")
     (find-one-as-map collection { :_id id } fields))
  ([^String collection id ^List fields keywordize]
     (check-not-nil! id "id must not be nil")
     (find-one-as-map collection { :_id id } fields keywordize)))


;;
;; monger.collection/group
;;

;; TBD


;;
;; monger.collection/count
;;
(defn count
  "Returns the number of documents in this collection.

   Takes optional conditions as an argument.

   (monger.collection/count collection)

   (monger.collection/count collection { :first_name \"Paul\" })"
  (^long [^String collection]
         (let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)]
           (.count coll)))
  (^long [^String collection ^Map conditions]
         (let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)]
           (.count coll (to-db-object conditions))))
  (^long [^DB db ^String collection ^Map conditions]
         (let [^DBCollection coll (.getCollection db collection)]
           (.count coll (to-db-object conditions)))))

(defn any?
  "Whether the collection has any items at all, or items matching the query.

   EXAMPLES:

   ;; whether the collection has any items
   (mgcol/any? collection)

   (mgcol/any? collection { :language \"Clojure\" })"
  ([^String collection]
     (> (count collection) 0))
  ([^String collection ^Map conditions]
     (> (count collection conditions) 0))
  ([^DB db ^String collection ^Map conditions]
     (> (count db collection conditions) 0)))


(defn empty?
  "Whether the collection is empty.

   EXAMPLES:
   (mgcol/empty? \"things\")"
  ([^String collection]
     (= (count collection) 0))
  ([^DB db ^String collection]
     (= (count db collection {}) 0)))


;; monger.collection/update

(defn ^WriteResult update
  "Performs an update operation.

   Please note that update is a potentially destructive operation. It will replace your document with the given one,
   emptying the fields not mentioned in (^Map document). In order to only change certain fields, please use
   \"$set\".

   EXAMPLES

   (monger.collection/update \"people\" { :first_name \"Raul\" } { \"$set\" { :first_name \"Paul\" } })

   You can use all the MongoDB modifier operations ($inc, $set, $unset, $push, $pushAll, $addToSet, $pop, $pull,
   $pullAll, $rename, $bit) here as well.

   EXAMPLES

   (monger.collection/update \"people\" { :first_name \"Paul\" } { \"$set\" { :index 1 } })
   (monger.collection/update \"people\" { :first_name \"Paul\" } { \"$inc\" { :index 5 } })

   (monger.collection/update \"people\" { :first_name \"Paul\" } { \"$unset\" { :years_on_stage 1} })

   It also takes modifiers, such as :upsert and :multi.

   EXAMPLES

   ;; adds the :band field to all the records found in the \"people\" collection;
   ;; without :multi, only the first matched record would be updated
   (monger.collection/update \"people\" { } { \"$set\" { :band \"The Beatles\" }} :multi true)

   ;; inserts the record if it did not exist in the collection
   (monger.collection/update \"people\" { :first_name \"Yoko\" } { :first_name \"Yoko\" :last_name \"Ono\" } :upsert true)

   By default :upsert and :multi are false."
  ([^String collection ^Map conditions ^Map document & { :keys [upsert multi write-concern]
                                                         :or   { upsert false
                                                                 multi  false
                                                                 write-concern monger.core/*mongodb-write-concern* } }]
     (let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)]
       (.update coll (to-db-object conditions) (to-db-object document) upsert multi write-concern))))

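;; A hedged usage sketch (not part of the library source): contrasting a
;; full-document update with a "$set" update. The collection and field
;; names below are hypothetical.
(comment
  ;; replaces the whole matched document with { :first_name "Paul" }
  (monger.collection/update "people" { :first_name "Raul" } { :first_name "Paul" })

  ;; only changes :first_name, leaving the other fields intact
  (monger.collection/update "people" { :first_name "Raul" } { "$set" { :first_name "Paul" } }))
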
(defn ^WriteResult update-by-id
  "Updates a document with the given id"
  [^String collection ^ObjectId id ^Map document & { :keys [upsert write-concern]
                                                     :or   { upsert false
                                                             write-concern monger.core/*mongodb-write-concern* } }]
  (check-not-nil! id "id must not be nil")
  (let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)]
    (.update coll (to-db-object { :_id id }) (to-db-object document) upsert false write-concern)))


;; monger.collection/save

(defn ^WriteResult save
  "Saves an object to the given collection (does insert or update based on the object _id).

   If the object is not present in the database, an insert operation will be performed.
   If the object is already in the database, it will be updated.

   EXAMPLES

   (monger.collection/save \"people\" { :first_name \"Ian\" :last_name \"Gillan\" })"
  ([^String collection ^Map document]
     (let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)]
       (.save coll (to-db-object document) monger.core/*mongodb-write-concern*)))
  ([^String collection ^Map document ^WriteConcern write-concern]
     (let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)]
       (.save coll (to-db-object document) write-concern)))
  ([^DB db ^String collection ^Map document ^WriteConcern write-concern]
     (let [^DBCollection coll (.getCollection db collection)]
       (.save coll (to-db-object document) write-concern))))


;; monger.collection/remove

(defn ^WriteResult remove
  "Removes objects from the database.

   EXAMPLES

   ;; removes all documents from the collection
   (monger.collection/remove collection)

   ;; removes documents matching the given query
   (monger.collection/remove collection { :language \"Clojure\" })"
  ([^String collection]
     (let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)]
       (.remove coll (to-db-object {}))))
  ([^String collection ^Map conditions]
     (let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)]
       (.remove coll (to-db-object conditions))))
  ([^DB db ^String collection ^Map conditions]
     (let [^DBCollection coll (.getCollection db collection)]
       (.remove coll (to-db-object conditions)))))


(defn ^WriteResult remove-by-id
  "Removes a single document with the given id"
  ([^String collection ^ObjectId id]
     (remove-by-id monger.core/*mongodb-database* collection id))
  ([^DB db ^String collection ^ObjectId id]
     (check-not-nil! id "id must not be nil")
     (let [^DBCollection coll (.getCollection db collection)]
       (.remove coll (to-db-object { :_id id })))))


;;
;; monger.collection/create-index
;;

(defn create-index
  "Forces creation of an index on a set of fields, if one does not already exist.

   EXAMPLES

   ;; will create an index on the \"language\" field
   (monger.collection/create-index collection { \"language\" 1 })"
  ([^String collection ^Map keys]
     (let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)]
       (.createIndex coll (to-db-object keys))))
  ([^DB db ^String collection ^Map keys]
     (let [^DBCollection coll (.getCollection db collection)]
       (.createIndex coll (to-db-object keys)))))


;;
;; monger.collection/ensure-index
;;

(defn ensure-index
  "Creates an index on a set of fields, if one does not already exist.
   ensureIndex in the Java driver is optimized and is inexpensive if the index already exists.

   EXAMPLES

   (monger.collection/ensure-index collection { \"language\" 1 })"
  ([^String collection ^Map keys]
     (let [coll ^DBCollection (.getCollection monger.core/*mongodb-database* collection)]
       (.ensureIndex ^DBCollection coll ^DBObject (to-db-object keys))))
  ([^String collection ^Map keys ^String name]
     (let [coll ^DBCollection (.getCollection monger.core/*mongodb-database* collection)]
       (.ensureIndex coll ^DBObject (to-db-object keys) ^String name))))


;;
;; monger.collection/indexes-on
;;

(defn indexes-on
  "Returns a list of the indexes for this collection.

   EXAMPLES

   (monger.collection/indexes-on collection)"
  [^String collection]
  (let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)]
    (from-db-object (.getIndexInfo coll) true)))


;;
;; monger.collection/drop-index
;;

(defn drop-index
  "Drops an index from this collection."
  ([^String collection ^String name]
     (let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)]
       (.dropIndex coll name)))
  ([^DB db ^String collection ^String name]
     (let [^DBCollection coll (.getCollection db collection)]
       (.dropIndex coll name))))


(defn drop-indexes
  "Drops all indexes from this collection."
  ([^String collection]
     (.dropIndexes ^DBCollection (.getCollection monger.core/*mongodb-database* collection)))
  ([^DB db ^String collection]
     (.dropIndexes ^DBCollection (.getCollection db collection))))


;;
;; monger.collection/exists?, /create, /drop, /rename
;;


(defn exists?
  "Checks whether a collection with the given name exists.

   EXAMPLE:

   (monger.collection/exists? \"coll\")"
  ([^String collection]
     (.collectionExists monger.core/*mongodb-database* collection))
  ([^DB db ^String collection]
     (.collectionExists db collection)))


(defn create
  "Creates a collection with the given name and options."
  ([^String collection ^Map options]
     (.createCollection monger.core/*mongodb-database* collection (to-db-object options)))
  ([^DB db ^String collection ^Map options]
     (.createCollection db collection (to-db-object options))))

(defn drop
  "Deletes a collection from the database.

   EXAMPLE:

   (monger.collection/drop \"collection-to-drop\")"
  ([^String collection]
     (let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)]
       (.drop coll)))
  ([^DB db ^String collection]
     (let [^DBCollection coll (.getCollection db collection)]
       (.drop coll))))

(defn rename
  "Renames a collection.

   EXAMPLE:

   (monger.collection/rename \"old_name\" \"new_name\")"
  ([^String from ^String to]
     (let [^DBCollection coll (.getCollection monger.core/*mongodb-database* from)]
       (.rename coll to)))
  ([^String from ^String to ^Boolean drop-target]
     (let [^DBCollection coll (.getCollection monger.core/*mongodb-database* from)]
       (.rename coll to drop-target)))
  ([^DB db ^String from ^String to ^Boolean drop-target]
     (let [^DBCollection coll (.getCollection db from)]
       (.rename coll to drop-target))))


;;
;; Map/Reduce
;;

(defn map-reduce
  "Performs a map/reduce operation."
  ([^String collection ^String js-mapper ^String js-reducer ^String output ^Map query]
     (let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)]
       (.mapReduce coll js-mapper js-reducer output (to-db-object query))))
  ([^String collection ^String js-mapper ^String js-reducer ^String output ^MapReduceCommand$OutputType output-type ^Map query]
     (let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)]
       (.mapReduce coll js-mapper js-reducer output output-type (to-db-object query)))))
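;; A hedged usage sketch (not part of the library source; the collection,
;; field and output names are hypothetical): counting documents per :state
;; with map-reduce, writing the results to the "mr_output" collection.
(comment
  (monger.collection/map-reduce "people"
                                "function() { emit(this.state, 1); }"
                                "function(k, vals) { var sum = 0; vals.forEach(function(v) { sum += v; }); return sum; }"
                                "mr_output"
                                {}))
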
|
;;
;; monger.collection/distinct
;;

(defn distinct
  "Finds distinct values for a key"
  ([^String collection ^String key]
     (let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)]
       (.distinct coll ^String (to-db-object key))))
  ([^String collection ^String key ^Map query]
     (let [^DBCollection coll (.getCollection monger.core/*mongodb-database* collection)]
       (.distinct coll ^String (to-db-object key) ^DBObject (to-db-object query))))
  ([^DB db ^String collection ^String key ^Map query]
     (let [^DBCollection coll (.getCollection db collection)]
       (.distinct coll ^String (to-db-object key) ^DBObject (to-db-object query)))))
47  src/monger/command.clj  Normal file
@@ -0,0 +1,47 @@
;; Copyright (c) 2011-2012 Michael S. Klishin
;; Copyright (c) 2012 Toby Hede
;;
;; The use and distribution terms for this software are covered by the
;; Eclipse Public License 1.0 (http://opensource.org/licenses/eclipse-1.0.php)
;; which can be found in the file epl-v10.html at the root of this distribution.
;; By using this software in any fashion, you are agreeing to be bound by
;; the terms of this license.
;; You must not remove this notice, or any other, from this software.

(ns monger.command
  (:use [monger.conversion])
  (:refer-clojure :exclude [find remove count drop distinct empty?])
  (:import [com.mongodb Mongo DB DBObject CommandResult]
           [java.util Map])
  (:require [monger core]))


(defn collection-stats
  ([collection]
     (collection-stats monger.core/*mongodb-database* collection))
  ([^DB database collection]
     (monger.core/command database { :collstats collection })))

(defn db-stats
  ([]
     (db-stats monger.core/*mongodb-database*))
  ([^DB database]
     (monger.core/command database { :dbStats 1 })))


(defn reindex-collection
  ([collection]
     (reindex-collection monger.core/*mongodb-database* collection))
  ([^DB database collection]
     (monger.core/command database { :reIndex collection })))


(defn server-status
  ([]
     (server-status monger.core/*mongodb-database*))
  ([^DB database]
     (monger.core/command database { :serverStatus 1 })))


(defn top []
  (monger.core/command (monger.core/get-db "admin") { :top 1 }))
123  src/monger/conversion.clj  Normal file
@@ -0,0 +1,123 @@
;; Original author is Andrew Boekhoff
;;
;; Portions of the code are Copyright (c) 2009 Andrew Boekhoff
;; Copyright (c) 2011-2012 Michael S. Klishin
;;
;; Permission is hereby granted, free of charge, to any person obtaining a copy
;; of this software and associated documentation files (the "Software"), to deal
;; in the Software without restriction, including without limitation the rights
;; to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
;; copies of the Software, and to permit persons to whom the Software is
;; furnished to do so, subject to the following conditions:
;;
;; The above copyright notice and this permission notice shall be included in
;; all copies or substantial portions of the Software.
;;
;; THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
;; IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
;; FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
;; AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
;; LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
;; OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
;; THE SOFTWARE.

(ns monger.conversion
  (:import [com.mongodb DBObject BasicDBObject BasicDBList DBCursor]
           [clojure.lang IPersistentMap Keyword]
           [java.util List Map Date]
           [org.bson.types ObjectId]))

(defprotocol ConvertToDBObject
  (to-db-object [input] "Converts given piece of Clojure data to the BasicDBObject the MongoDB Java driver uses"))

(extend-protocol ConvertToDBObject
  nil
  (to-db-object [input]
    input)

  Object
  (to-db-object [input]
    input)

  Keyword
  (to-db-object [^Keyword input] (.getName input))

  IPersistentMap
  (to-db-object [^IPersistentMap input]
    (let [o (BasicDBObject.)]
      (doseq [[k v] input]
        (.put o (to-db-object k) (to-db-object v)))
      o))

  List
  (to-db-object [^List input] (map to-db-object input))

  DBObject
  (to-db-object [^DBObject input] input))


(declare associate-pairs)
(defprotocol ConvertFromDBObject
  (from-db-object [input keywordize] "Converts given DBObject instance to a piece of Clojure data"))

(extend-protocol ConvertFromDBObject
  nil
  (from-db-object [input keywordize] input)

  Object
  (from-db-object [input keywordize] input)

  Map
  (from-db-object [^Map input keywordize]
    (associate-pairs (.entrySet input) keywordize))

  List
  (from-db-object [^List input keywordize]
    (vec (map #(from-db-object % keywordize) input)))

  BasicDBList
  (from-db-object [^BasicDBList input keywordize]
    (vec (map #(from-db-object % keywordize) input)))

  DBObject
  (from-db-object [^DBObject input keywordize]
    ;; DBObject provides .toMap, but the implementation in
    ;; subclass GridFSFile unhelpfully throws
    ;; UnsupportedOperationException. This part is taken from congomongo and
    ;; may need revisiting at a later point. MK.
    (associate-pairs (for [key-set (.keySet input)] [key-set (.get input key-set)])
                     keywordize)))


(defn- associate-pairs [pairs keywordize]
  ;; Taking the keywordize test out of the fn reduces derefs
  ;; dramatically, which was the main barrier to matching pure-Java
  ;; performance for this marshalling. Taken from congomongo. MK.
  (reduce (if keywordize
            (fn [m [^String k v]]
              (assoc m (keyword k) (from-db-object v true)))
            (fn [m [^String k v]]
              (assoc m k (from-db-object v false))))
          {} (reverse pairs)))


(defprotocol ConvertToObjectId
  (to-object-id [input] "Instantiates ObjectId from input unless the input itself is an ObjectId instance. In that case, returns input as is."))

(extend-protocol ConvertToObjectId
  String
  (to-object-id [^String input]
    (ObjectId. input))

  Date
  (to-object-id [^Date input]
    (ObjectId. input))

  ObjectId
  (to-object-id [^ObjectId input]
    input))

260  src/monger/core.clj  Normal file
@@ -0,0 +1,260 @@
;; Copyright (c) 2011-2012 Michael S. Klishin
;;
;; The use and distribution terms for this software are covered by the
;; Eclipse Public License 1.0 (http://opensource.org/licenses/eclipse-1.0.php)
;; which can be found in the file epl-v10.html at the root of this distribution.
;; By using this software in any fashion, you are agreeing to be bound by
;; the terms of this license.
;; You must not remove this notice, or any other, from this software.

(ns ^{:author "Michael S. Klishin"
      :doc "Thin idiomatic wrapper around the MongoDB Java client. monger.core includes
fundamental functions that work with connections & databases. Most of the functionality
is in other monger.* namespaces, in particular monger.collection."}
  monger.core
  (:refer-clojure :exclude [count])
  (:use [monger.conversion])
  (:import [com.mongodb Mongo DB WriteConcern DBObject DBCursor CommandResult Bytes MongoOptions ServerAddress]
           [com.mongodb.gridfs GridFS]
           [java.util Map]))

;;
;; Defaults
;;

(def ^:dynamic ^String *mongodb-host* "localhost")
(def ^:dynamic ^long   *mongodb-port* 27017)

(declare ^:dynamic ^Mongo *mongodb-connection*)
(declare ^:dynamic ^DB    *mongodb-database*)
(def ^:dynamic ^WriteConcern *mongodb-write-concern* WriteConcern/SAFE)

(declare ^:dynamic ^GridFS *mongodb-gridfs*)

;;
;; API
;;

(defn ^Mongo connect
  "Connects to MongoDB. When used without arguments, connects to the Java
   driver's default host and port.

   Arguments:
     :host (*mongodb-host* by default)
     :port (*mongodb-port* by default)

   EXAMPLES

   (monger.core/connect)
   (monger.core/connect { :host \"db3.intranet.local\", :port 27787 })"
  ([]
     (Mongo.))
  ([^ServerAddress server-address ^MongoOptions options]
     (Mongo. server-address options))
  ([{ :keys [host port] :or { host *mongodb-host*, port *mongodb-port* }}]
     (Mongo. ^String host ^Long port)))


(defn get-db-names
  "Gets a set of all database names present on the server"
  ([]
     (get-db-names *mongodb-connection*))
  ([^Mongo connection]
     (set (.getDatabaseNames connection))))


(defn ^DB get-db
  "Gets a database reference by name.

   EXAMPLES

   (monger.core/get-db \"myapp_production\")
   (monger.core/get-db connection \"myapp_production\")"
  ([^String name]
     (.getDB *mongodb-connection* name))
  ([^Mongo connection ^String name]
     (.getDB connection name)))


(defn authenticate
  ([^String db ^String username ^chars password]
     (authenticate *mongodb-connection* db username password))
  ([^Mongo connection ^String db ^String username ^chars password]
     (.authenticate (.getDB connection db) username password)))


(defmacro with-connection
  [conn & body]
  `(binding [*mongodb-connection* ~conn]
     (do ~@body)))


(defmacro with-db
  [db & body]
  `(binding [*mongodb-database* ~db]
     (do ~@body)))

(defmacro with-gridfs
  [fs & body]
  `(binding [*mongodb-gridfs* ~fs]
     (do ~@body)))


(defn server-address
  ([^String hostname]
     (ServerAddress. hostname))
  ([^String hostname ^long port]
     (ServerAddress. hostname port)))

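;; A hedged usage sketch (not part of the library source): scoping a
;; connection and a database with the binding macros above. The database
;; name is hypothetical.
(comment
  (let [conn (monger.core/connect)]
    (monger.core/with-connection conn
      (monger.core/with-db (monger.core/get-db conn "myapp_development")
        (monger.collection/count "documents")))))
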
(defn mongo-options
  [& { :keys [connections-per-host threads-allowed-to-block-for-connection-multiplier
              max-wait-time connect-timeout socket-timeout socket-keep-alive auto-connect-retry max-auto-connect-retry-time
              safe w w-timeout fsync j] }]
  (let [mo (MongoOptions.)]
    (when connections-per-host
      (set! (. mo connectionsPerHost) connections-per-host))
    (when threads-allowed-to-block-for-connection-multiplier
      (set! (. mo threadsAllowedToBlockForConnectionMultiplier) threads-allowed-to-block-for-connection-multiplier))
    (when max-wait-time
      (set! (. mo maxWaitTime) max-wait-time))
    (when connect-timeout
      (set! (. mo connectTimeout) connect-timeout))
    (when socket-timeout
      (set! (. mo socketTimeout) socket-timeout))
    (when socket-keep-alive
      (set! (. mo socketKeepAlive) socket-keep-alive))
    (when auto-connect-retry
      (set! (. mo autoConnectRetry) auto-connect-retry))
    (when max-auto-connect-retry-time
      (set! (. mo maxAutoConnectRetryTime) max-auto-connect-retry-time))
    (when safe
      (set! (. mo safe) safe))
    (when w
      (set! (. mo w) w))
    (when w-timeout
      (set! (. mo wtimeout) w-timeout))
    (when j
      (set! (. mo j) j))
    (when fsync
      (set! (. mo fsync) fsync))
    mo))

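;; A hedged usage sketch (not part of the library source): building
;; MongoOptions and connecting with them. The host and port are hypothetical.
(comment
  (let [opts (monger.core/mongo-options :connections-per-host 50 :connect-timeout 2000)
        addr (monger.core/server-address "db.intranet.local" 27017)]
    (monger.core/connect addr opts)))
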
(defn set-connection!
  "Sets the given MongoDB connection as default by altering the *mongodb-connection* var"
  ^Mongo [^Mongo conn]
  (alter-var-root (var *mongodb-connection*) (constantly conn)))

(defn connect!
  "Connects to MongoDB and stores the connection in the *mongodb-connection* var"
  ^Mongo [& args]
  (let [c (apply connect args)]
    (set-connection! c)))


(defn set-db!
  "Sets the *mongodb-database* var to the given db and updates the *mongodb-gridfs* var state.
   Recommended for applications that only use one database."
  [db]
  (alter-var-root (var *mongodb-database*) (constantly db))
  (alter-var-root (var *mongodb-gridfs*) (constantly (GridFS. db))))


(defn set-default-write-concern!
  "Sets the *mongodb-write-concern* var to the given write concern.

   Unlike the official Java driver, Monger uses WriteConcern/SAFE by default. We think defaults should be safe first
   and WebScale fast second."
  [wc]
  (alter-var-root (var *mongodb-write-concern*) (constantly wc)))

(defn ^CommandResult command
|
||||
"Runs a database command (please check MongoDB documentation for the complete list of commands). Some common commands
|
||||
are:
|
||||
|
||||
{ :buildinfo 1 } returns version number and build information about the current MongoDB server, should be executed via admin DB.
|
||||
|
||||
{ :collstats collection-name [ :scale scale ] } returns stats about given collection.
|
||||
|
||||
{ :dbStats 1 } returns the stats of current database
|
||||
|
||||
{ :dropDatabase 1 } deletes the current database
|
||||
|
||||
{ :findAndModify find-and-modify-config } runs find, modify and return for the given query.
|
||||
Takes :query, :sory, :remove, :update, :new, :fields and :upsert arguments.
|
||||
Please refer MongoDB documentation for details. http://www.mongodb.org/display/DOCS/findAndModify+Command
|
||||
|
||||
{ :fsync config } performs a full fsync, that flushes all pending writes to database, provides an optional write lock that will make
|
||||
backups easier.
|
||||
   Please refer to the MongoDB documentation for details: http://www.mongodb.org/display/DOCS/fsync+Command

   { :getLastError 1 } returns the status of the last operation on the current connection.

   { :group group-config } performs grouping aggregation; docs and support for grouping are TBD in Monger.

   { :listCommands 1 } displays the list of available commands.

   { :profile new-profile-level } sets the database profiler to profile level N.

   { :reIndex coll } performs a re-index on a given collection.

   { :renameCollection old-name :to new-name } renames a collection from old-name to new-name.

   { :repairDatabase 1 } repairs and compacts the current database (may be very time-consuming, depending on DB size).

   Replica set commands:

   { :isMaster 1 } checks if this server is a master server.

   { :replSetGetStatus 1 } gets the status of a replica set.

   { :replSetInitiate replica-config } initiates a replica set with the given config.

   { :replSetReconfig replica-config } sets a given config for the replica set.

   { :replSetStepDown seconds } manually tells a member to step down as primary. It will become eligible to be primary again after the specified number of seconds.

   { :replSetFreeze seconds } freezes the state of a member; call with 0 to unfreeze.

   { :resync 1 } starts a full resync of a replica slave.

   For more information, please refer to the MongoDB Replica Set Commands guide: http://www.mongodb.org/display/DOCS/Replica+Set+Commands

   { :serverStatus 1 } gets administrative statistics about the server.

   { :shutdown 1 } shuts the MongoDB server down.

   { :top 1 } gets a breakdown of usage by collection.

   { :validate namespace-name } validates the namespace (collection or index). May be very time-consuming, depending on DB size.

   For :distinct, :count, :drop, :dropIndexes and :mapReduce we suggest using monger.collection/distinct, /count, /drop, /drop-indexes and /map-reduce respectively.
  "
  ([^Map cmd]
     (.command ^DB *mongodb-database* ^DBObject (to-db-object cmd)))
  ([^DB database ^Map cmd]
     (.command ^DB database ^DBObject (to-db-object cmd))))
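A quick REPL sketch of issuing one of the commands listed above (assumes a connection and default database were already set up elsewhere via monger.core; the command map is illustrative):

```clojure
;; hypothetical REPL session; assumes a connection and default
;; database have been established via monger.core beforehand
(require '[monger.core :as mg]
         '[monger.result :as mr])

;; run a command map against the current database;
;; returns a CommandResult that monger.result can inspect
(let [res (mg/command { :serverStatus 1 })]
  (mr/ok? res))
```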

(defprotocol Countable
  (count [this] "Returns the size of the object"))

(extend-protocol Countable
  DBCursor
  (count [^DBCursor this]
    (.count this)))


(defn ^DBObject get-last-error
  "Returns the error (if there is one) from the previous operation on this connection.

   The result of this command looks like:

   #<CommandResult { \"serverUsed\" : \"127.0.0.1:27017\" , \"n\" : 0 , \"connectionId\" : 66 , \"err\" : null , \"ok\" : 1.0}>

   The value for err will be null if no error occurred, or a description otherwise.

   Important note: when calling this method directly, it is undefined which connection \"getLastError\" is called on.
   You may need to explicitly use a \"consistent request\", see requestStart(). For most purposes it is better not to call this method directly; instead use WriteConcern."
  ([]
     (.getLastError ^DB *mongodb-database*))
  ([^DB database]
     (.getLastError ^DB database))
  ([^DB database ^Integer w ^Integer wtimeout ^Boolean fsync]
     (.getLastError ^DB database w wtimeout fsync))
  ([^DB database ^WriteConcern write-concern]
     (.getLastError ^DB database write-concern)))

src/monger/db.clj (new file, 39 lines)
@@ -0,0 +1,39 @@
;; Copyright (c) 2011-2012 Michael S. Klishin
;; Copyright (c) 2012 Toby Hede
;;
;; The use and distribution terms for this software are covered by the
;; Eclipse Public License 1.0 (http://opensource.org/licenses/eclipse-1.0.php)
;; which can be found in the file epl-v10.html at the root of this distribution.
;; By using this software in any fashion, you are agreeing to be bound by
;; the terms of this license.
;; You must not remove this notice, or any other, from this software.

(ns monger.db
  (:refer-clojure :exclude [find remove count drop distinct empty?])
  (:import [com.mongodb Mongo DB DBCollection])
  (:require [monger core]))


(defn add-user
  "Adds a new user for this db"
  ([^String username ^chars password]
     (.addUser ^DB monger.core/*mongodb-database* username password))
  ([^DB database ^String username ^chars password]
     (.addUser ^DB database username password)))


(defn drop-db
  "Drops the currently set database (via core/set-db) or the specified database."
  ([]
     (.dropDatabase ^DB monger.core/*mongodb-database*))
  ([^DB database]
     (.dropDatabase ^DB database)))


(defn get-collection-names
  "Returns a set containing the names of all collections in this database."
  ([]
     (set (.getCollectionNames ^DB monger.core/*mongodb-database*)))
  ([^DB database]
     (set (.getCollectionNames ^DB database))))
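A short usage sketch of the three functions above (assumes a default database was set via monger.core; user name and password are illustrative):

```clojure
;; hypothetical REPL session; assumes monger.core has a default
;; database set beforehand
(require '[monger.db :as mdb])

;; list the collections in the current database
(mdb/get-collection-names)

;; add a user (illustrative credentials), then drop the database
;; once it is no longer needed
(mdb/add-user "reporting" (.toCharArray "s3cret"))
(mdb/drop-db)
```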

src/monger/gridfs.clj (new file, 115 lines)
@@ -0,0 +1,115 @@
;; Copyright (c) 2011-2012 Michael S. Klishin
;;
;; The use and distribution terms for this software are covered by the
;; Eclipse Public License 1.0 (http://opensource.org/licenses/eclipse-1.0.php)
;; which can be found in the file epl-v10.html at the root of this distribution.
;; By using this software in any fashion, you are agreeing to be bound by
;; the terms of this license.
;; You must not remove this notice, or any other, from this software.

(ns monger.gridfs
  (:refer-clojure :exclude [remove find])
  (:require [monger.core]
            [clojure.java.io :as io])
  (:use [monger.conversion])
  (:import [com.mongodb DB DBObject]
           [com.mongodb.gridfs GridFS GridFSInputFile]
           [java.io InputStream File]))

;;
;; Implementation
;;

(def ^{:doc "Type object for a Java primitive byte array."
       :private true}
  byte-array-type (class (make-array Byte/TYPE 0)))

;; ...

;;
;; API
;;

(defn remove
  ([]
     (remove {}))
  ([query]
     (.remove ^GridFS monger.core/*mongodb-gridfs* ^DBObject (to-db-object query)))
  ([^GridFS fs query]
     (.remove fs ^DBObject (to-db-object query))))

(defn remove-all
  ([]
     (remove {}))
  ([^GridFS fs]
     (remove fs {})))

(defn all-files
  ([]
     (.getFileList ^GridFS monger.core/*mongodb-gridfs*))
  ([query]
     (.getFileList ^GridFS monger.core/*mongodb-gridfs* query))
  ([^GridFS fs query]
     (.getFileList fs query)))


(defprotocol GridFSInputFileFactory
  (^GridFSInputFile make-input-file [input] "Makes GridFSInputFile out of given input"))

(extend byte-array-type
  GridFSInputFileFactory
  {:make-input-file (fn [^bytes input]
                      (.createFile ^GridFS monger.core/*mongodb-gridfs* input))})

(extend-protocol GridFSInputFileFactory
  String
  (make-input-file [^String input]
    (.createFile ^GridFS monger.core/*mongodb-gridfs* ^InputStream (io/make-input-stream input {:encoding "UTF-8"})))

  File
  (make-input-file [^File input]
    (.createFile ^GridFS monger.core/*mongodb-gridfs* ^InputStream (io/make-input-stream input {:encoding "UTF-8"})))

  InputStream
  (make-input-file [^InputStream input]
    (.createFile ^GridFS monger.core/*mongodb-gridfs* ^InputStream input)))


(defmacro store
  [^GridFSInputFile input & body]
  `(let [^GridFSInputFile f# (doto ~input ~@body)]
     (.save f# GridFS/DEFAULT_CHUNKSIZE)
     (from-db-object f# true)))


(defprotocol Finders
  (find [input] "Finds multiple files using given input (an ObjectId, filename or query)")
  (find-one [input] "Finds one file using given input (an ObjectId, filename or query)"))

(extend-protocol Finders
  String
  (find [^String input]
    (vec (.find ^GridFS monger.core/*mongodb-gridfs* input)))
  (find-one [^String input]
    (.findOne ^GridFS monger.core/*mongodb-gridfs* input))

  org.bson.types.ObjectId
  (find-one [^org.bson.types.ObjectId input]
    (.findOne ^GridFS monger.core/*mongodb-gridfs* input))

  DBObject
  (find [^DBObject input]
    (vec (.find ^GridFS monger.core/*mongodb-gridfs* input)))
  (find-one [^DBObject input]
    (.findOne ^GridFS monger.core/*mongodb-gridfs* input))

  clojure.lang.PersistentArrayMap
  (find [^clojure.lang.PersistentArrayMap input]
    (find (to-db-object input))))
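Putting the pieces above together, a usage sketch: `make-input-file` builds a `GridFSInputFile`, `store` applies the `doto` clauses and saves it, and the finders look it up later. The file path and metadata are illustrative; assumes `*mongodb-gridfs*` is bound via monger.core:

```clojure
;; hypothetical usage; assumes *mongodb-gridfs* is bound beforehand
;; and that /tmp/report.pdf exists (illustrative path)
(require '[monger.gridfs :as gridfs])

;; store a file from disk; the doto clauses set GridFSInputFile metadata
(gridfs/store (gridfs/make-input-file (java.io.File. "/tmp/report.pdf"))
  (.setFilename "report.pdf")
  (.setContentType "application/pdf"))

;; later, look it up by filename
(gridfs/find-one "report.pdf")
```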

src/monger/internal/fn.clj (new file, 73 lines)
@@ -0,0 +1,73 @@
;; Copyright (c) 2011-2012 Michael S. Klishin
;;
;; The use and distribution terms for this software are covered by the
;; Eclipse Public License 1.0 (http://opensource.org/licenses/eclipse-1.0.php)
;; which can be found in the file epl-v10.html at the root of this distribution.
;; By using this software in any fashion, you are agreeing to be bound by
;; the terms of this license.
;; You must not remove this notice, or any other, from this software.

(ns monger.internal.fn)

;;
;; Implementation
;;

(defn- apply-to-values
  "Applies function f to all values in map m"
  [m f]
  (into {} (for [[k v] m]
             [k (f v)])))

;;
;; API
;;

(defn fpartial
  "Like clojure.core/partial but prepopulates the last N arguments (the first is passed in later)"
  [f & args]
  (fn [arg & more] (apply f arg (concat args more))))

(defprotocol IFNExpansion
  (expand-all [x] "Replaces functions with their invocation results, recursively expands maps, evaluates all other values to themselves")
  (expand-all-with [x f] "Replaces functions with their invocation results that function f is applied to, recursively expands maps, evaluates all other values to themselves"))

(extend-protocol IFNExpansion
  java.lang.Integer
  (expand-all [i] i)
  (expand-all-with [i f] i)

  java.lang.Long
  (expand-all [l] l)
  (expand-all-with [l f] l)

  java.lang.String
  (expand-all [s] s)
  (expand-all-with [s f] s)

  java.lang.Float
  (expand-all [fl] fl)
  (expand-all-with [fl f] fl)

  java.lang.Double
  (expand-all [d] d)
  (expand-all-with [d f] d)

  ;; maps are also functions, so be careful here. MK.
  clojure.lang.IPersistentMap
  (expand-all [m] (apply-to-values m expand-all))
  (expand-all-with [m f] (apply-to-values m (fpartial expand-all-with f)))

  clojure.lang.PersistentVector
  (expand-all [v] (map expand-all v))
  (expand-all-with [v f] (map (fpartial expand-all-with f) v))

  ;; this distinguishes functions from maps, sets and so on, which are also
  ;; clojure.lang.AFn subclasses. MK.
  clojure.lang.AFunction
  (expand-all [f] (f))
  (expand-all-with [f expander] (expander f))

  Object
  (expand-all [x] x)
  (expand-all-with [x f] x))
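To make the expansion behavior concrete: plain values evaluate to themselves, bare functions are invoked, and maps are walked recursively. A small sketch (assumes monger.internal.fn is on the classpath):

```clojure
(require '[monger.internal.fn :refer [expand-all fpartial]])

;; functions are invoked, maps are expanded recursively,
;; everything else evaluates to itself
(expand-all { :a (fn [] 1) :b { :c (fn [] 2) } :d 3 })
;; => { :a 1 :b { :c 2 } :d 3 }

;; fpartial prepopulates the *last* arguments;
;; the first argument is supplied at call time
((fpartial - 1) 10)
;; => 9
```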

src/monger/internal/pagination.clj (new file, 15 lines)
@@ -0,0 +1,15 @@
;; Copyright (c) 2011-2012 Michael S. Klishin
;;
;; The use and distribution terms for this software are covered by the
;; Eclipse Public License 1.0 (http://opensource.org/licenses/eclipse-1.0.php)
;; which can be found in the file epl-v10.html at the root of this distribution.
;; By using this software in any fashion, you are agreeing to be bound by
;; the terms of this license.
;; You must not remove this notice, or any other, from this software.

(ns monger.internal.pagination)

(defn offset-for
  [^long page ^long per-page]
  (* per-page
     (- (max page 1) 1)))
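The arithmetic is simple: pages are 1-based, and page numbers below 1 are clamped to the first page. For example:

```clojure
(require '[monger.internal.pagination :refer [offset-for]])

;; page 1 starts at offset 0
(offset-for 1 10) ;; => 0
;; page 3 with 10 per page skips the first 20 documents
(offset-for 3 10) ;; => 20
;; page numbers below 1 are clamped to the first page
(offset-for 0 10) ;; => 0
```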

src/monger/joda_time.clj (new file, 34 lines)
@@ -0,0 +1,34 @@
;; Copyright (c) 2011-2012 Michael S. Klishin
;;
;; The use and distribution terms for this software are covered by the
;; Eclipse Public License 1.0 (http://opensource.org/licenses/eclipse-1.0.php)
;; which can be found in the file epl-v10.html at the root of this distribution.
;; By using this software in any fashion, you are agreeing to be bound by
;; the terms of this license.
;; You must not remove this notice, or any other, from this software.

(ns monger.joda-time
  (:import [org.joda.time DateTime DateTimeZone ReadableInstant]
           [org.joda.time.format ISODateTimeFormat])
  (:use [monger.conversion])
  (:require [clojure.data.json :as json]))

;;
;; API
;;

(extend-protocol ConvertToDBObject
  org.joda.time.DateTime
  (to-db-object [^DateTime input]
    (to-db-object (.toDate input))))

(extend-protocol ConvertFromDBObject
  java.util.Date
  (from-db-object [^java.util.Date input keywordize]
    (org.joda.time.DateTime. input)))


(extend-protocol json/Write-JSON
  org.joda.time.DateTime
  (write-json [^DateTime object out escape-unicode?]
    (json/write-json (.print (ISODateTimeFormat/dateTime) ^ReadableInstant object) out escape-unicode?)))

src/monger/js.clj (new file, 31 lines)
@@ -0,0 +1,31 @@
;; Copyright (c) 2011-2012 Michael S. Klishin
;;
;; The use and distribution terms for this software are covered by the
;; Eclipse Public License 1.0 (http://opensource.org/licenses/eclipse-1.0.php)
;; which can be found in the file epl-v10.html at the root of this distribution.
;; By using this software in any fashion, you are agreeing to be bound by
;; the terms of this license.
;; You must not remove this notice, or any other, from this software.

(ns monger.js
  (:require [clojure.java.io :as io]))

;;
;; Implementation
;;

(defn- normalize-resource
  [^String path]
  (if (.endsWith path ".js")
    path
    (str path ".js")))

;;
;; API
;;

(defn load-resource
  (^String [^String path]
     (slurp (io/resource (normalize-resource path)))))

src/monger/json.clj (new file, 21 lines)
@@ -0,0 +1,21 @@
;; Copyright (c) 2011-2012 Michael S. Klishin
;;
;; The use and distribution terms for this software are covered by the
;; Eclipse Public License 1.0 (http://opensource.org/licenses/eclipse-1.0.php)
;; which can be found in the file epl-v10.html at the root of this distribution.
;; By using this software in any fashion, you are agreeing to be bound by
;; the terms of this license.
;; You must not remove this notice, or any other, from this software.

(ns monger.json
  (:import (org.bson.types ObjectId))
  (:require [clojure.data.json :as json]))

;;
;; API
;;

(extend-protocol json/Write-JSON
  ObjectId
  (write-json [^ObjectId object out escape-unicode?]
    (json/write-json (.toString object) out escape-unicode?)))

src/monger/operators.clj (new file, 173 lines)
@@ -0,0 +1,173 @@
;; Copyright (c) 2011-2012 Michael S. Klishin
;;
;; The use and distribution terms for this software are covered by the
;; Eclipse Public License 1.0 (http://opensource.org/licenses/eclipse-1.0.php)
;; which can be found in the file epl-v10.html at the root of this distribution.
;; By using this software in any fashion, you are agreeing to be bound by
;; the terms of this license.
;; You must not remove this notice, or any other, from this software.

(ns monger.operators)

(defmacro ^{:private true} defoperator
  [operator]
  (let [op#     (str operator)
        op-sym# (symbol op#)]
    `(def ~op-sym# ~op#)))

;;
;; QUERY OPERATORS
;;

;; $gt is the "greater than" comparator
;; $gte is the "greater than or equals" comparator
;; $lt is the "less than" comparator
;; $lte is the "less than or equals" comparator
;;
;; EXAMPLES:
;;   (monger.collection/find "libraries" { :users { $gt 10 } })
;;   (monger.collection/find "libraries" { :users { $gte 10 } })
;;   (monger.collection/find "libraries" { :users { $lt 10 } })
;;   (monger.collection/find "libraries" { :users { $lte 10 } })
(defoperator $gt)
(defoperator $gte)
(defoperator $lt)
(defoperator $lte)

;; $all matches all values in the array
;;
;; EXAMPLES:
;;   (mgcol/find-maps "languages" { :tags { $all [ "functional" "object-oriented" ] } })
(defoperator $all)

;; The $in operator is analogous to the SQL IN modifier, allowing you to specify an array of possible matches.
;;
;; EXAMPLES:
;;   (mgcol/find-maps "languages" { :tags { $in [ "functional" "object-oriented" ] } })
(defoperator $in)

;; The $nin operator is similar to $in, but it selects objects for which the specified field does not
;; have any value in the specified array.
;;
;; EXAMPLES:
;;   (mgcol/find-maps "languages" { :tags { $nin [ "functional" ] } })
(defoperator $nin)

;; $ne is the "not equals" comparator
;;
;; EXAMPLES:
;;   (monger.collection/find "libraries" { :language { $ne "Clojure" } })
(defoperator $ne)

;; $elemMatch checks if an element in an array matches the specified expression
;;
;; EXAMPLES:
;;   ;; Matches elements with :text "Nice!" and :rating greater than or equal to 1
;;   (monger.collection/find "comments" { $elemMatch { :text "Nice!" :rating { $gte 1 } } })
(defoperator $elemMatch)

;;
;; LOGIC OPERATORS
;;

;; $and lets you use a boolean "and" in the query. Logical "and" means that all the given expressions should be true for a positive match.
;;
;; EXAMPLES:
;;   ;; Matches all libraries where :language is "Clojure" and :users is greater than 10
;;   (monger.collection/find "libraries" { $and [{ :language "Clojure" } { :users { $gt 10 } }] })
(defoperator $and)

;; $or lets you use a boolean "or" in the query. Logical "or" means that at least one of the given expressions should be true for a positive match.
;;
;; EXAMPLES:
;;   ;; Matches all libraries whose :name is "mongoid" or :language is "Ruby"
;;   (monger.collection/find "libraries" { $or [ { :name "mongoid" } { :language "Ruby" } ] })
(defoperator $or)

;; $nor lets you use a boolean expression opposite to "or" in the query (think: neither). Give $nor a list of expressions, all of which should
;; be false for a positive match.
;;
;; EXAMPLES:
;;   (monger.collection/find "libraries" { $nor [{ :language "Clojure" } { :users { $gt 10 } }] })
(defoperator $nor)

;;
;; ATOMIC MODIFIERS
;;

;; $inc increments one or many fields by the given value, or sets the field to that value if it does not exist.
;;
;; EXAMPLES:
;;   (monger.collection/update "scores" { :_id user-id } { $inc { :score 10 } })
;;   (monger.collection/update "scores" { :_id user-id } { $inc { :score 20 :bonus 10 } })
(defoperator $inc)

;; $set sets an existing (or non-existing) field (or set of fields) to the given value.
;; $set supports all datatypes.
;;
;; EXAMPLES:
;;   (monger.collection/update "things" { :_id oid } { $set { :weight 20.5 } })
;;   (monger.collection/update "things" { :_id oid } { $set { :weight 20.5 :height 12.5 } })
(defoperator $set)

;; $unset deletes a given field; non-existing fields are ignored.
;;
;; EXAMPLES:
;;   (monger.collection/update "things" { :_id oid } { $unset { :weight 1 } })
(defoperator $unset)

;; $rename renames a given field
;;
;; EXAMPLES:
;;   (monger.collection/update "things" { :_id oid } { $rename { :old_field_name "new_field_name" } })
(defoperator $rename)

;; $push appends a _single_ value to field, if field is an existing array; otherwise sets field to the array [value] if field is not present.
;; If field is present but is not an array, an error condition is raised.
;;
;; EXAMPLES:
;;   (mgcol/update "docs" { :_id oid } { $push { :tags "modifiers" } })
(defoperator $push)

;; $pushAll appends each value in value_array to field, if field is an existing array; otherwise sets field to the array value_array
;; if field is not present. If field is present but is not an array, an error condition is raised.
;;
;; EXAMPLES:
;;   (mgcol/update coll { :_id oid } { $pushAll { :tags ["mongodb" "docs"] } })
(defoperator $pushAll)

;; $addToSet adds value to the array only if it's not in the array already, if field is an existing array; otherwise sets field to the
;; array [value] if field is not present. If field is present but is not an array, an error condition is raised.
;;
;; EXAMPLES:
;;   (mgcol/update coll { :_id oid } { $addToSet { :tags "modifiers" } })
(defoperator $addToSet)

;; $pop removes the last element in an array, if 1 is passed;
;; if -1 is passed, removes the first element in an array.
;;
;; EXAMPLES:
;;   (mgcol/update coll { :_id oid } { $pop { :tags 1 } })
;;   (mgcol/update coll { :_id oid } { $pop { :tags 1 :categories 1 } })
(defoperator $pop)

;; $pull removes all occurrences of value from field, if field is an array. If field is present but is not an array, an error condition
;; is raised.
;;
;; EXAMPLES:
;;   (mgcol/update coll { :_id oid } { $pull { :measurements 1.2 } })
;;   (mgcol/update coll { :_id oid } { $pull { :measurements { $gte 1.2 } } })
(defoperator $pull)

;; $pullAll removes all occurrences of each value in value_array from field, if field is an array. If field is present but is not an array,
;; an error condition is raised.
;;
;; EXAMPLES:
;;   (mgcol/update coll { :_id oid } { $pullAll { :measurements [1.2 1.35] } })
(defoperator $pullAll)

(defoperator $bit)

src/monger/query.clj (new file, 138 lines)
@@ -0,0 +1,138 @@
;; Copyright (c) 2011-2012 Michael S. Klishin
;;
;; The use and distribution terms for this software are covered by the
;; Eclipse Public License 1.0 (http://opensource.org/licenses/eclipse-1.0.php)
;; which can be found in the file epl-v10.html at the root of this distribution.
;; By using this software in any fashion, you are agreeing to be bound by
;; the terms of this license.
;; You must not remove this notice, or any other, from this software.

(ns monger.query
  (:refer-clojure :exclude [select find sort])
  (:require [monger.core]
            [monger.internal pagination])
  (:import [com.mongodb DB DBCollection DBObject DBCursor ReadPreference]
           [java.util List])
  (:use [monger conversion operators]))


;;
;; Implementation
;;

(def ^{:dynamic true} *query-collection*)

;;
;; Cursor/chain methods
;;
;; A Monger query is an auxiliary construction that helps to create function chains over cursors.
;; You can specify several chained actions that will be performed on a certain collection through
;; query fields.
;;
;; Existing query fields:
;;
;; :fields     - selects which fields are returned. The default is all fields. _id is always returned.
;; :sort       - adds a sort to the query.
;; :skip       - skips the first N results.
;; :limit      - returns a maximum of N results.
;; :batch-size - limits the number of elements returned in one batch.
;; :hint       - forces MongoDB to use a specific index for a query in order to improve performance.
;; :snapshot   - uses snapshot mode for the query. Snapshot mode assures no duplicates are returned, or objects missed
;;   which were present at both the start and end of the query's execution (if an object is new during the query, or
;;   deleted during the query, it may or may not be returned, even with snapshot mode). Note that short query responses
;;   (less than 1MB) are always effectively snapshotted. Currently, snapshot mode may not be used with sorting or explicit hints.
(defn empty-query
  ([]
     {:query             {}
      :sort              {}
      :fields            []
      :skip              0
      :limit             0
      :batch-size        256
      :hint              nil
      :snapshot          false
      :keywordize-fields true})
  ([^DBCollection coll]
     (merge (empty-query) {:collection coll})))

(defn- fields-to-db-object
  [^List fields]
  (to-db-object (zipmap fields (repeat 1))))

(defn exec
  [{:keys [collection query fields skip limit sort batch-size hint snapshot read-preference keywordize-fields]
    :or   {limit 0 batch-size 256 skip 0}}]
  (let [cursor (doto ^DBCursor (.find ^DBCollection collection (to-db-object query) (fields-to-db-object fields))
                 (.limit limit)
                 (.skip skip)
                 (.sort (to-db-object sort))
                 (.batchSize batch-size)
                 (.hint ^DBObject (to-db-object hint)))]
    (when snapshot
      (.snapshot cursor))
    (when read-preference
      (.setReadPreference cursor read-preference))
    (map (fn [x] (from-db-object x keywordize-fields))
         (seq cursor))))

;;
;; API
;;

(defn find
  [m query]
  (merge m {:query query}))

(defn fields
  [m flds]
  (merge m {:fields flds}))

(defn sort
  [m srt]
  (merge m {:sort srt}))

(defn skip
  [m ^long n]
  (merge m {:skip n}))

(defn limit
  [m ^long n]
  (merge m {:limit n}))

(defn batch-size
  [m ^long n]
  (merge m {:batch-size n}))

(defn hint
  [m h]
  (merge m {:hint h}))

(defn snapshot
  [m]
  (merge m {:snapshot true}))

(defn read-preference
  [m ^ReadPreference rp]
  (merge m {:read-preference rp}))

(defn keywordize-fields
  [m bool]
  (merge m {:keywordize-fields bool}))

(defn paginate
  [m & {:keys [page per-page] :or {page 1 per-page 10}}]
  (merge m {:limit per-page :skip (monger.internal.pagination/offset-for page per-page)}))

(defmacro with-collection
  [^String coll & body]
  `(binding [*query-collection* (if (string? ~coll)
                                  (.getCollection ^DB monger.core/*mongodb-database* ~coll)
                                  ~coll)]
     (let [query# (-> (empty-query *query-collection*) ~@body)]
       (exec query#))))

(defmacro partial-query
  [& body]
  `(-> {} ~@body))
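The chain functions above are meant to be threaded inside `with-collection`, which builds up the query map and hands it to `exec`. A usage sketch (assumes a default database is bound via monger.core; the collection and field names are illustrative):

```clojure
;; hypothetical query; assumes monger.core/*mongodb-database* is bound
;; and a "libraries" collection exists (illustrative)
(require '[monger.query :as mq])

(mq/with-collection "libraries"
  (mq/find { :language "Clojure" })
  (mq/fields [:name :users])
  (mq/sort { :users -1 })
  (mq/paginate :page 2 :per-page 10))
```

Using the `mq/` alias avoids shadowing clojure.core/find and clojure.core/sort in the calling namespace.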

src/monger/result.clj (new file, 54 lines)
@@ -0,0 +1,54 @@
;; Copyright (c) 2011-2012 Michael S. Klishin
;;
;; The use and distribution terms for this software are covered by the
;; Eclipse Public License 1.0 (http://opensource.org/licenses/eclipse-1.0.php)
;; which can be found in the file epl-v10.html at the root of this distribution.
;; By using this software in any fashion, you are agreeing to be bound by
;; the terms of this license.
;; You must not remove this notice, or any other, from this software.

(ns monger.result
  (:import [com.mongodb DBObject WriteResult MapReduceOutput]
           [clojure.lang IPersistentMap])
  (:require [monger conversion]))


;;
;; API
;;

(defprotocol MongoCommandResult
  (ok? [input] "Returns true if command result is a success")
  (has-error? [input] "Returns true if command result indicates an error")
  (updated-existing? [input] "Returns true if command result has `updatedExisting` field set to true"))

(extend-protocol MongoCommandResult
  DBObject
  (ok?
    [^DBObject result]
    (.contains [true "true" 1 1.0] (.get result "ok")))
  (has-error?
    [^DBObject result]
    ;; yes, this is exactly the logic MongoDB Java driver uses.
    (> (count (str (.get result "err"))) 0))
  (updated-existing?
    [^DBObject result]
    (let [v ^Boolean (.get result "updatedExisting")]
      (and v (Boolean/valueOf v))))

  WriteResult
  (ok?
    [^WriteResult result]
    (and (not (nil? result)) (ok? (.getLastError result))))
  (has-error?
    [^WriteResult result]
    (has-error? (.getLastError result)))
  (updated-existing?
    [^WriteResult result]
    (updated-existing? (.getLastError result)))

  MapReduceOutput
  (ok?
    [^MapReduceOutput result]
    (ok? ^DBObject (.getRaw result))))

src/monger/testing.clj (new file, 116 lines)
@@ -0,0 +1,116 @@
;; Copyright (c) 2011-2012 Michael S. Klishin
;;
;; The use and distribution terms for this software are covered by the
;; Eclipse Public License 1.0 (http://opensource.org/licenses/eclipse-1.0.php)
;; which can be found in the file epl-v10.html at the root of this distribution.
;; By using this software in any fashion, you are agreeing to be bound by
;; the terms of this license.
;; You must not remove this notice, or any other, from this software.

(ns monger.testing
  (:require [monger.collection :as mc]
            [monger.result :as mr])
  (:use [monger.internal.fn :only (expand-all expand-all-with) :as fntools])
  (:import [org.bson.types ObjectId]))


;;
;; API
;;

(defmacro defcleaner
  "Defines a fixture function that removes all documents from a collection. If the collection is not specified,
   a conventionally named var will be used. Intended to be used with clojure.test/use-fixtures but may
   be useful on its own.

   Examples:

   (defcleaner events) ;; collection name will be taken from the events-collection var
   (defcleaner people \"accounts\") ;; collection name is given
  "
  [entities & coll-name]
  (let [coll-arg (if coll-name
                   (str (first coll-name))
                   (symbol (str entities "-collection")))
        fn-name  (symbol (str "purge-" entities))]
    `(defn ~fn-name
       [f#]
       (mc/remove ~coll-arg)
       (f#)
       (mc/remove ~coll-arg))))


(def factories (atom {}))
(def defaults  (atom {}))

(defn defaults-for
  [f-group & {:as attributes}]
  (swap! defaults (fn [v]
                    (assoc v (name f-group) attributes))))

(defn factory
  [f-group f-name & {:as attributes}]
  (swap! factories (fn [a]
                     (assoc-in a [(name f-group) (name f-name)] attributes))))


(declare build seed)

(defn- expand-associate-for-building
  [f]
  (let [mt               (meta f)
        [f-group f-name] (f)]
    (:_id (build f-group f-name))))

(defn- expand-for-building
  "Expands functions, treating those with association metadata (see `parent-id` for example) specially"
  [f]
  (let [mt (meta f)]
    (if (:associate-gen mt)
      (expand-associate-for-building f)
      (f))))

(defn- expand-associate-for-seeding
  [f]
  (let [mt               (meta f)
        [f-group f-name] (f)]
    (:_id (seed f-group f-name))))

(defn- expand-for-seeding
  "Expands functions, treating those with association metadata (see `parent-id` for example) specially,
   making sure parent documents are persisted first"
  [f]
  (let [mt (meta f)]
    (if (:associate-gen mt)
      (expand-associate-for-seeding f)
      (f))))

(defn build
  "Generates a new document and returns it"
  [f-group f-name & {:as overrides}]
  (let [d          (@defaults (name f-group))
        attributes (get-in @factories [(name f-group) (name f-name)])
        merged     (merge {:_id (ObjectId.)} d attributes overrides)]
    (expand-all-with merged expand-for-building)))

(defn seed
  "Generates and inserts a new document, then returns it"
  [f-group f-name & {:as overrides}]
  (io!
   (let [d          (@defaults (name f-group))
         attributes (get-in @factories [(name f-group) (name f-name)])
         merged     (merge {:_id (ObjectId.)} d attributes overrides)
         expanded   (expand-all-with merged expand-for-seeding)]
     (assert (mr/ok? (mc/insert f-group expanded)))
     expanded)))

(defn embedded-doc
  [f-group f-name & {:as overrides}]
  (fn []
    (apply build f-group f-name (flatten (vec overrides)))))

(defn parent-id
  [f-group f-name]
  (with-meta (fn []
               [f-group f-name]) {:associate-gen true :parent-gen true}))
|
||||
|
||||
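The registration and expansion functions above compose into a small factory DSL. A minimal usage sketch, assuming hypothetical `:users` and `:domains` factory groups (all attribute names and values below are invented for illustration; `seed` additionally requires a live MongoDB connection):

```clojure
;; Register shared defaults for every factory in the :users group
(defaults-for :users :created-at (java.util.Date.))

;; Register a named factory; values can be plain data or functions,
;; including association generators such as `parent-id`
(factory :users :admin
         :username "admin"
         :roles    ["admin"])

(factory :domains :example
         :name  "example.org"
         ;; expands to the :_id of a :users :admin document;
         ;; built in memory for `build`, persisted first for `seed`
         :owner (parent-id :users :admin))

;; In-memory document with an override applied on top of factory attributes
(build :domains :example :name "another-example.org")

;; Same document, but inserted via `mc/insert` before being returned
(seed :domains :example)
```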
44  src/monger/util.clj  Normal file
@ -0,0 +1,44 @@
;; Copyright (c) 2011-2012 Michael S. Klishin
;;
;; The use and distribution terms for this software are covered by the
;; Eclipse Public License 1.0 (http://opensource.org/licenses/eclipse-1.0.php)
;; which can be found in the file epl-v10.html at the root of this distribution.
;; By using this software in any fashion, you are agreeing to be bound by
;; the terms of this license.
;; You must not remove this notice, or any other, from this software.

(ns monger.util
  (:import (java.security SecureRandom)
           (java.math BigInteger)
           (org.bson.types ObjectId)
           (com.mongodb DBObject)
           (clojure.lang IPersistentMap)
           (java.util Map)))

;;
;; API
;;

(defn ^String random-uuid
  "Generates a secure random UUID string"
  []
  (.toString (java.util.UUID/randomUUID)))

(defn ^String random-str
  "Generates a secure random string"
  [^long n ^long num-base]
  (.toString (BigInteger. n (SecureRandom.)) num-base))

(defn ^ObjectId object-id
  "Returns a new BSON object id"
  []
  (ObjectId.))

(defprotocol GetDocumentId
  (get-id [input] "Returns document id"))

(extend-protocol GetDocumentId
  DBObject
  (get-id
    [^DBObject object]
    (.get object "_id"))

  IPersistentMap
  (get-id
    [^IPersistentMap object]
    (or (:_id object) (object "_id"))))
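The `GetDocumentId` protocol gives one entry point for extracting `_id` from both raw driver objects and Clojure maps, regardless of key type. A small REPL sketch covering only the map case (the `DBObject` branch needs the MongoDB driver on the classpath):

```clojure
(require '[monger.util :as mu])

(let [oid (mu/object-id)]
  ;; works for keyword keys and string keys alike
  [(mu/get-id {:_id oid})
   (mu/get-id {"_id" oid})])
;; both elements are the same ObjectId
```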
@ -1,136 +0,0 @@
(ns monger.test.aggregation-framework-test
  (:require [monger.core :as mg]
            [monger.collection :as mc]
            [clojure.test :refer :all]
            [monger.operators :refer :all]))

(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")
      coll "docs"]
  (defn purge-collections
    [f]
    (mc/purge-many db [coll])
    (f)
    (mc/purge-many db [coll]))

  (use-fixtures :each purge-collections)

  (deftest test-basic-single-stage-$project-aggregation-no-keywordize
    (let [batch    [{"state" "CA" "quantity" 1 "price" 199.00}
                    {"state" "NY" "quantity" 2 "price" 199.00}
                    {"state" "NY" "quantity" 1 "price" 299.00}
                    {"state" "IL" "quantity" 2 "price" 11.50}
                    {"state" "CA" "quantity" 2 "price" 2.95}
                    {"state" "IL" "quantity" 3 "price" 5.50}]
          expected #{{"quantity" 1 "state" "CA"}
                     {"quantity" 2 "state" "NY"}
                     {"quantity" 1 "state" "NY"}
                     {"quantity" 2 "state" "IL"}
                     {"quantity" 2 "state" "CA"}
                     {"quantity" 3 "state" "IL"}}]
      (mc/insert-batch db coll batch)
      (is (= 6 (mc/count db coll)))
      (let [result (->>
                    (mc/aggregate db coll [{$project {"state" 1 "quantity" 1}}] :keywordize false)
                    (map #(select-keys % ["state" "quantity"]))
                    (set))]
        (is (= expected result)))))

  (deftest test-basic-single-stage-$project-aggregation
    (let [batch    [{:state "CA" :quantity 1 :price 199.00}
                    {:state "NY" :quantity 2 :price 199.00}
                    {:state "NY" :quantity 1 :price 299.00}
                    {:state "IL" :quantity 2 :price 11.50}
                    {:state "CA" :quantity 2 :price 2.95}
                    {:state "IL" :quantity 3 :price 5.50}]
          expected #{{:quantity 1 :state "CA"}
                     {:quantity 2 :state "NY"}
                     {:quantity 1 :state "NY"}
                     {:quantity 2 :state "IL"}
                     {:quantity 2 :state "CA"}
                     {:quantity 3 :state "IL"}}]
      (mc/insert-batch db coll batch)
      (is (= 6 (mc/count db coll)))
      (let [result (set (map #(select-keys % [:state :quantity])
                             (mc/aggregate db coll [{$project {:state 1 :quantity 1}}])))]
        (is (= expected result)))))


  (deftest test-basic-projection-with-multiplication
    (let [batch    [{:state "CA" :quantity 1 :price 199.00}
                    {:state "NY" :quantity 2 :price 199.00}
                    {:state "NY" :quantity 1 :price 299.00}
                    {:state "IL" :quantity 2 :price 11.50}
                    {:state "CA" :quantity 2 :price 2.95}
                    {:state "IL" :quantity 3 :price 5.50}]
          expected #{{:_id "NY" :subtotal 398.0}
                     {:_id "NY" :subtotal 299.0}
                     {:_id "IL" :subtotal 23.0}
                     {:_id "CA" :subtotal 5.9}
                     {:_id "IL" :subtotal 16.5}
                     {:_id "CA" :subtotal 199.0}}]
      (mc/insert-batch db coll batch)
      (let [result (set (mc/aggregate db coll [{$project {:subtotal {$multiply ["$quantity" "$price"]}
                                                          :_id "$state"}}]))]
        (is (= expected result)))))


  (deftest test-basic-total-aggregation
    (let [batch    [{:state "CA" :quantity 1 :price 199.00}
                    {:state "NY" :quantity 2 :price 199.00}
                    {:state "NY" :quantity 1 :price 299.00}
                    {:state "IL" :quantity 2 :price 11.50}
                    {:state "CA" :quantity 2 :price 2.95}
                    {:state "IL" :quantity 3 :price 5.50}]
          expected #{{:_id "CA" :total 204.9} {:_id "IL" :total 39.5} {:_id "NY" :total 697.0}}]
      (mc/insert-batch db coll batch)
      (let [result (set (mc/aggregate db coll [{$project {:subtotal {$multiply ["$quantity" "$price"]}
                                                          :_id 1
                                                          :state 1}}
                                              {$group {:_id "$state"
                                                       :total {$sum "$subtotal"}}}]))]
        (is (= expected result)))))


  (deftest test-$first-aggregation-operator
    (let [batch    [{:state "CA"}
                    {:state "IL"}]
          expected "CA"]
      (mc/insert-batch db coll batch)
      (let [result (:state (first (mc/aggregate db coll [{$group {:_id 1 :state {$first "$state"}}}])))]
        (is (= expected result)))))

  (deftest test-$last-aggregation-operator
    (let [batch    [{:state "CA"}
                    {:state "IL"}]
          expected "IL"]
      (mc/insert-batch db coll batch)
      (let [result (:state (first (mc/aggregate db coll [{$group {:_id 1 :state {$last "$state"}}}])))]
        (is (= expected result)))))

  (deftest test-cursor-aggregation
    (let [batch    [{:state "CA" :quantity 1 :price 199.00}
                    {:state "NY" :quantity 2 :price 199.00}
                    {:state "NY" :quantity 1 :price 299.00}
                    {:state "IL" :quantity 2 :price 11.50}
                    {:state "CA" :quantity 2 :price 2.95}
                    {:state "IL" :quantity 3 :price 5.50}]
          expected #{{:quantity 1 :state "CA"}
                     {:quantity 2 :state "NY"}
                     {:quantity 1 :state "NY"}
                     {:quantity 2 :state "IL"}
                     {:quantity 2 :state "CA"}
                     {:quantity 3 :state "IL"}}]
      (mc/insert-batch db coll batch)
      (is (= 6 (mc/count db coll)))
      (let [result (set (map #(select-keys % [:state :quantity])
                             (mc/aggregate db coll [{$project {:state 1 :quantity 1}}] :cursor {:batch-size 10})))]
        (is (= expected result)))))

  (deftest test-explain-aggregate
    (let [batch [{:state "CA" :price 100}
                 {:state "CA" :price 10}
                 {:state "IL" :price 50}]]
      (mc/insert-batch db coll batch)
      (let [result (mc/explain-aggregate db coll [{$match {:state "CA"}}])]
        (is (:ok result))))))
293  test/monger/test/atomic_modifiers.clj  Normal file
@ -0,0 +1,293 @@
(set! *warn-on-reflection* true)

(ns monger.test.atomic-modifiers
  (:import [com.mongodb WriteResult WriteConcern DBCursor DBObject CommandResult$CommandFailure]
           [org.bson.types ObjectId]
           [java.util Date])
  (:require [monger core util]
            [monger.collection :as mgcol]
            [monger.result :as mgres]
            [monger.test.helper :as helper])
  (:use [clojure.test]
        [monger.operators]
        [monger.test.fixtures]))

(helper/connect!)

(use-fixtures :each purge-docs purge-things purge-scores)


;;
;; $inc
;;

(deftest increment-a-single-existing-field-using-$inc-modifier
  (let [coll "scores"
        oid  (ObjectId.)]
    (mgcol/insert coll {:_id oid :username "l33r0y" :score 100})
    (mgcol/update coll {:_id oid} {$inc {:score 20}})
    (is (= 120 (:score (mgcol/find-map-by-id coll oid))))))

(deftest set-a-single-non-existing-field-using-$inc-modifier
  (let [coll "scores"
        oid  (ObjectId.)]
    (mgcol/insert coll {:_id oid :username "l33r0y"})
    (mgcol/update coll {:_id oid} {$inc {:score 30}})
    (is (= 30 (:score (mgcol/find-map-by-id coll oid))))))


(deftest increment-multiple-existing-fields-using-$inc-modifier
  (let [coll "scores"
        oid  (ObjectId.)]
    (mgcol/insert coll {:_id oid :username "l33r0y" :score 100 :bonus 0})
    (mgcol/update coll {:_id oid} {$inc {:score 20 :bonus 10}})
    (is (= {:_id oid :score 120 :bonus 10 :username "l33r0y"} (mgcol/find-map-by-id coll oid)))))


(deftest increment-and-set-multiple-existing-fields-using-$inc-modifier
  (let [coll "scores"
        oid  (ObjectId.)]
    (mgcol/insert coll {:_id oid :username "l33r0y" :score 100})
    (mgcol/update coll {:_id oid} {$inc {:score 20 :bonus 10}})
    (is (= {:_id oid :score 120 :bonus 10 :username "l33r0y"} (mgcol/find-map-by-id coll oid)))))



;;
;; $set
;;

(deftest update-a-single-existing-field-using-$set-modifier
  (let [coll "things"
        oid  (ObjectId.)]
    (mgcol/insert coll {:_id oid :weight 10.0})
    (mgcol/update coll {:_id oid} {$set {:weight 20.5}})
    (is (= 20.5 (:weight (mgcol/find-map-by-id coll oid [:weight]))))))

(deftest set-a-single-non-existing-field-using-$set-modifier
  (let [coll "things"
        oid  (ObjectId.)]
    (mgcol/insert coll {:_id oid :weight 10.0})
    (mgcol/update coll {:_id oid} {$set {:height 17.2}})
    (is (= 17.2 (:height (mgcol/find-map-by-id coll oid [:height]))))))

(deftest update-multiple-existing-fields-using-$set-modifier
  (let [coll "things"
        oid  (ObjectId.)]
    (mgcol/insert coll {:_id oid :weight 10.0 :height 15.2})
    (mgcol/update coll {:_id oid} {$set {:weight 20.5 :height 25.6}})
    (is (= {:_id oid :weight 20.5 :height 25.6} (mgcol/find-map-by-id coll oid [:weight :height])))))


(deftest update-and-set-multiple-fields-using-$set-modifier
  (let [coll "things"
        oid  (ObjectId.)]
    (mgcol/insert coll {:_id oid :weight 10.0})
    (mgcol/update coll {:_id oid} {$set {:weight 20.5 :height 25.6}})
    (is (= {:_id oid :weight 20.5 :height 25.6} (mgcol/find-map-by-id coll oid [:weight :height])))))


;;
;; $unset
;;

(deftest unset-a-single-existing-field-using-$unset-modifier
  (let [coll "docs"
        oid  (ObjectId.)]
    (mgcol/insert coll {:_id oid :title "Document 1" :published true})
    (mgcol/update coll {:_id oid} {$unset {:published 1}})
    (is (= {:_id oid :title "Document 1"} (mgcol/find-map-by-id coll oid)))))


(deftest unset-multiple-existing-fields-using-$unset-modifier
  (let [coll "docs"
        oid  (ObjectId.)]
    (mgcol/insert coll {:_id oid :title "Document 1" :published true :featured true})
    (mgcol/update coll {:_id oid} {$unset {:published 1 :featured true}})
    (is (= {:_id oid :title "Document 1"} (mgcol/find-map-by-id coll oid)))))


(deftest unsetting-an-unexisting-field-using-$unset-modifier-is-not-considered-an-issue
  (let [coll "docs"
        oid  (ObjectId.)]
    (mgcol/insert coll {:_id oid :title "Document 1" :published true})
    (is (mgres/ok? (mgcol/update coll {:_id oid} {$unset {:published 1 :featured true}})))
    (is (= {:_id oid :title "Document 1"} (mgcol/find-map-by-id coll oid)))))


;;
;; $push
;;

(deftest initialize-an-array-using-$push-modifier
  (let [coll  "docs"
        oid   (ObjectId.)
        title "$push modifier appends value to field"]
    (mgcol/insert coll {:_id oid :title title})
    (mgcol/update coll {:_id oid} {$push {:tags "modifiers"}})
    (is (= {:_id oid :title title :tags ["modifiers"]} (mgcol/find-map-by-id coll oid)))))

(deftest add-value-to-an-existing-array-using-$push-modifier
  (let [coll  "docs"
        oid   (ObjectId.)
        title "$push modifier appends value to field"]
    (mgcol/insert coll {:_id oid :title title :tags ["mongodb"]})
    (mgcol/update coll {:_id oid} {$push {:tags "modifiers"}})
    (is (= {:_id oid :title title :tags ["mongodb" "modifiers"]} (mgcol/find-map-by-id coll oid)))))


;; this is a common mistake, I leave it here to demonstrate it. You almost never
;; actually want to do this! What you really want is to use $pushAll instead of $push. MK.
(deftest add-array-value-to-an-existing-array-using-$push-modifier
  (let [coll  "docs"
        oid   (ObjectId.)
        title "$push modifier appends value to field"]
    (mgcol/insert coll {:_id oid :title title :tags ["mongodb"]})
    (mgcol/update coll {:_id oid} {$push {:tags ["modifiers" "operators"]}})
    (is (= {:_id oid :title title :tags ["mongodb" ["modifiers" "operators"]]} (mgcol/find-map-by-id coll oid)))))


(deftest double-add-value-to-an-existing-array-using-$push-modifier
  (let [coll  "docs"
        oid   (ObjectId.)
        title "$push modifier appends value to field"]
    (mgcol/insert coll {:_id oid :title title :tags ["mongodb"]})
    (mgcol/update coll {:_id oid} {$push {:tags "modifiers"}})
    (mgcol/update coll {:_id oid} {$push {:tags "modifiers"}})
    (is (= {:_id oid :title title :tags ["mongodb" "modifiers" "modifiers"]} (mgcol/find-map-by-id coll oid)))))

;;
;; $pushAll
;;

(deftest initialize-an-array-using-$pushAll-modifier
  (let [coll  "docs"
        oid   (ObjectId.)
        title "$pushAll modifier appends multiple values to field"]
    (mgcol/insert coll {:_id oid :title title})
    (mgcol/update coll {:_id oid} {$pushAll {:tags ["mongodb" "docs"]}})
    (is (= {:_id oid :title title :tags ["mongodb" "docs"]} (mgcol/find-map-by-id coll oid)))))

(deftest add-value-to-an-existing-array-using-$pushAll-modifier
  (let [coll  "docs"
        oid   (ObjectId.)
        title "$pushAll modifier appends multiple values to field"]
    (mgcol/insert coll {:_id oid :title title :tags ["mongodb"]})
    (mgcol/update coll {:_id oid} {$pushAll {:tags ["modifiers" "docs"]}})
    (is (= {:_id oid :title title :tags ["mongodb" "modifiers" "docs"]} (mgcol/find-map-by-id coll oid)))))


(deftest double-add-value-to-an-existing-array-using-$pushAll-modifier
  (let [coll  "docs"
        oid   (ObjectId.)
        title "$pushAll modifier appends multiple values to field"]
    (mgcol/insert coll {:_id oid :title title :tags ["mongodb" "docs"]})
    (mgcol/update coll {:_id oid} {$pushAll {:tags ["modifiers" "docs"]}})
    (is (= {:_id oid :title title :tags ["mongodb" "docs" "modifiers" "docs"]} (mgcol/find-map-by-id coll oid)))))


;;
;; $addToSet
;;

(deftest initialize-an-array-using-$addToSet-modifier
  (let [coll  "docs"
        oid   (ObjectId.)
        title "$addToSet modifier appends value to field unless it is already there"]
    (mgcol/insert coll {:_id oid :title title})
    (mgcol/update coll {:_id oid} {$addToSet {:tags "modifiers"}})
    (is (= {:_id oid :title title :tags ["modifiers"]} (mgcol/find-map-by-id coll oid)))))

(deftest add-value-to-an-existing-array-using-$addToSet-modifier
  (let [coll  "docs"
        oid   (ObjectId.)
        title "$addToSet modifier appends value to field unless it is already there"]
    (mgcol/insert coll {:_id oid :title title :tags ["mongodb"]})
    (mgcol/update coll {:_id oid} {$addToSet {:tags "modifiers"}})
    (is (= {:_id oid :title title :tags ["mongodb" "modifiers"]} (mgcol/find-map-by-id coll oid)))))


(deftest double-add-value-to-an-existing-array-using-$addToSet-modifier
  (let [coll  "docs"
        oid   (ObjectId.)
        title "$addToSet modifier appends value to field unless it is already there"]
    (mgcol/insert coll {:_id oid :title title :tags ["mongodb"]})
    (mgcol/update coll {:_id oid} {$addToSet {:tags "modifiers"}})
    (mgcol/update coll {:_id oid} {$addToSet {:tags "modifiers"}})
    (is (= {:_id oid :title title :tags ["mongodb" "modifiers"]} (mgcol/find-map-by-id coll oid)))))


;;
;; $pop
;;

(deftest pop-last-value-in-the-array-using-$pop-modifier
  (let [coll  "docs"
        oid   (ObjectId.)
        title "$pop modifier removes last or first value in the array"]
    (mgcol/insert coll {:_id oid :title title :tags ["products" "apple" "reviews"]})
    (mgcol/update coll {:_id oid} {$pop {:tags 1}})
    (is (= {:_id oid :title title :tags ["products" "apple"]} (mgcol/find-map-by-id coll oid)))))

(deftest unshift-first-value-in-the-array-using-$pop-modifier
  (let [coll  "docs"
        oid   (ObjectId.)
        title "$pop modifier removes last or first value in the array"]
    (mgcol/insert coll {:_id oid :title title :tags ["products" "apple" "reviews"]})
    (mgcol/update coll {:_id oid} {$pop {:tags -1}})
    (is (= {:_id oid :title title :tags ["apple" "reviews"]} (mgcol/find-map-by-id coll oid)))))

(deftest pop-last-values-from-multiple-arrays-using-$pop-modifier
  (let [coll  "docs"
        oid   (ObjectId.)
        title "$pop modifier removes last or first value in the array"]
    (mgcol/insert coll {:_id oid :title title :tags ["products" "apple" "reviews"] :categories ["apple" "reviews" "drafts"]})
    (mgcol/update coll {:_id oid} {$pop {:tags 1 :categories 1}})
    (is (= {:_id oid :title title :tags ["products" "apple"] :categories ["apple" "reviews"]} (mgcol/find-map-by-id coll oid)))))


;;
;; $pull
;;

(deftest remove-all-value-entries-from-array-using-$pull-modifier
  (let [coll  "docs"
        oid   (ObjectId.)
        title "$pull modifier removes all value entries in the array"]
    (mgcol/insert coll {:_id oid :title title :measurements [1.0 1.2 1.2 1.2 1.1 1.1 1.2 1.3 1.0]})
    (mgcol/update coll {:_id oid} {$pull {:measurements 1.2}})
    (is (= {:_id oid :title title :measurements [1.0 1.1 1.1 1.3 1.0]} (mgcol/find-map-by-id coll oid)))))

(deftest remove-all-value-entries-from-array-using-$pull-modifier-based-on-a-condition
  (let [coll  "docs"
        oid   (ObjectId.)
        title "$pull modifier removes all value entries in the array"]
    (mgcol/insert coll {:_id oid :title title :measurements [1.0 1.2 1.2 1.2 1.1 1.1 1.2 1.3 1.0]})
    (mgcol/update coll {:_id oid} {$pull {:measurements {$gte 1.2}}})
    (is (= {:_id oid :title title :measurements [1.0 1.1 1.1 1.0]} (mgcol/find-map-by-id coll oid)))))

;;
;; $pullAll
;;

(deftest remove-all-value-entries-from-array-using-$pullAll-modifier
  (let [coll  "docs"
        oid   (ObjectId.)
        title "$pullAll modifier removes entries of multiple values in the array"]
    (mgcol/insert coll {:_id oid :title title :measurements [1.0 1.2 1.2 1.2 1.1 1.1 1.2 1.3 1.0]})
    (mgcol/update coll {:_id oid} {$pullAll {:measurements [1.0 1.1 1.2]}})
    (is (= {:_id oid :title title :measurements [1.3]} (mgcol/find-map-by-id coll oid)))))


;;
;; $rename
;;

(deftest rename-a-single-field-using-$rename-modifier
  (let [coll  "docs"
        oid   (ObjectId.)
        title "$rename renames fields"
        v     [1.0 1.2 1.2 1.2 1.1 1.1 1.2 1.3 1.0]]
    (mgcol/insert coll {:_id oid :title title :measurements v})
    (mgcol/update coll {:_id oid} {$rename {:measurements "results"}})
    (is (= {:_id oid :title title :results v} (mgcol/find-map-by-id coll oid)))))
|
|
@ -1,461 +0,0 @@
|
|||
(ns monger.test.atomic-modifiers-test
  (:import [com.mongodb WriteResult WriteConcern DBObject]
           org.bson.types.ObjectId
           java.util.Date)
  (:require [monger.core :as mg]
            [monger.collection :as mc]
            [monger.result :refer [acknowledged?]]
            [clojure.test :refer :all]
            [monger.operators :refer :all]))


(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")]

  (defn purge-collections
    [f]
    (mc/remove db "docs")
    (mc/remove db "things")
    (mc/remove db "scores")
    (f)
    (mc/remove db "docs")
    (mc/remove db "things")
    (mc/remove db "scores"))

  (use-fixtures :each purge-collections)

  ;;
  ;; $inc
  ;;

  (deftest increment-a-single-existing-field-using-$inc-modifier
    (let [coll "scores"
          oid  (ObjectId.)]
      (mc/insert db coll {:_id oid :username "l33r0y" :score 100})
      (mc/update db coll {:_id oid} {$inc {:score 20}})
      (is (= 120 (:score (mc/find-map-by-id db coll oid))))))

  (deftest set-a-single-non-existing-field-using-$inc-modifier
    (let [coll "scores"
          oid  (ObjectId.)]
      (mc/insert db coll {:_id oid :username "l33r0y"})
      (mc/update db coll {:_id oid} {$inc {:score 30}})
      (is (= 30 (:score (mc/find-map-by-id db coll oid))))))


  (deftest increment-multiple-existing-fields-using-$inc-modifier
    (let [coll "scores"
          oid  (ObjectId.)]
      (mc/insert db coll {:_id oid :username "l33r0y" :score 100 :bonus 0})
      (mc/update db coll {:_id oid} {$inc {:score 20 :bonus 10}})
      (is (= {:_id oid :score 120 :bonus 10 :username "l33r0y"}
             (mc/find-map-by-id db coll oid)))))


  (deftest increment-and-set-multiple-existing-fields-using-$inc-modifier
    (let [coll "scores"
          oid  (ObjectId.)]
      (mc/insert db coll {:_id oid :username "l33r0y" :score 100})
      (mc/update db coll {:_id oid} {$inc {:score 20 :bonus 10}})
      (is (= {:_id oid :score 120 :bonus 10 :username "l33r0y"}
             (mc/find-map-by-id db coll oid)))))



  ;;
  ;; $set
  ;;

  (deftest update-a-single-existing-field-using-$set-modifier
    (let [coll "things"
          oid  (ObjectId.)]
      (mc/insert db coll {:_id oid :weight 10.0})
      (mc/update db coll {:_id oid} {$set {:weight 20.5}})
      (is (= 20.5 (:weight (mc/find-map-by-id db coll oid [:weight]))))))

  (deftest set-a-single-non-existing-field-using-$set-modifier
    (let [coll "things"
          oid  (ObjectId.)]
      (mc/insert db coll {:_id oid :weight 10.0})
      (mc/update db coll {:_id oid} {$set {:height 17.2}})
      (is (= 17.2 (:height (mc/find-map-by-id db coll oid [:height]))))))

  (deftest update-multiple-existing-fields-using-$set-modifier
    (let [coll "things"
          oid  (ObjectId.)]
      (mc/insert db coll {:_id oid :weight 10.0 :height 15.2})
      (mc/update db coll {:_id oid} {$set {:weight 20.5 :height 25.6}})
      (is (= {:_id oid :weight 20.5 :height 25.6}
             (mc/find-map-by-id db coll oid [:weight :height])))))


  (deftest update-and-set-multiple-fields-using-$set-modifier
    (let [coll "things"
          oid  (ObjectId.)]
      (mc/insert db coll {:_id oid :weight 10.0})
      (mc/update db coll {:_id oid} {$set {:weight 20.5 :height 25.6}})
      (is (= {:_id oid :weight 20.5 :height 25.6}
             (mc/find-map-by-id db coll oid [:weight :height])))))


  ;;
  ;; $unset
  ;;

  (deftest unset-a-single-existing-field-using-$unset-modifier
    (let [coll "docs"
          oid  (ObjectId.)]
      (mc/insert db coll {:_id oid :title "Document 1" :published true})
      (mc/update db coll {:_id oid} {$unset {:published 1}})
      (is (= {:_id oid :title "Document 1"}
             (mc/find-map-by-id db coll oid)))))


  (deftest unset-multiple-existing-fields-using-$unset-modifier
    (let [coll "docs"
          oid  (ObjectId.)]
      (mc/insert db coll {:_id oid :title "Document 1" :published true :featured true})
      (mc/update db coll {:_id oid} {$unset {:published 1 :featured true}})
      (is (= {:_id oid :title "Document 1"}
             (mc/find-map-by-id db coll oid)))))


  (deftest unsetting-an-unexisting-field-using-$unset-modifier-is-not-considered-an-issue
    (let [coll "docs"
          oid  (ObjectId.)]
      (mc/insert db coll {:_id oid :title "Document 1" :published true})
      (is (acknowledged? (mc/update db coll {:_id oid} {$unset {:published 1 :featured true}})))
      (is (= {:_id oid :title "Document 1"}
             (mc/find-map-by-id db coll oid)))))

  ;;
  ;; $setOnInsert
  ;;

  (deftest setOnInsert-in-upsert-for-non-existing-document
    (let [coll "docs"
          now  456
          oid  (ObjectId.)]
      (mc/find-and-modify db coll {:_id oid} {$set {:lastseen now} $setOnInsert {:firstseen now}} {:upsert true})
      (is (= {:_id oid :lastseen now :firstseen now}
             (mc/find-map-by-id db coll oid)))))

  (deftest setOnInsert-in-upsert-for-existing-document
    (let [coll   "docs"
          before 123
          now    456
          oid    (ObjectId.)]
      (mc/insert db coll {:_id oid :firstseen before :lastseen before})
      (mc/find-and-modify db coll {:_id oid} {$set {:lastseen now} $setOnInsert {:firstseen now}} {:upsert true})
      (is (= {:_id oid :lastseen now :firstseen before}
             (mc/find-map-by-id db coll oid)))))

  ;;
  ;; $push
  ;;

  (deftest initialize-an-array-using-$push-modifier
    (let [coll  "docs"
          oid   (ObjectId.)
          title "$push modifier appends value to field"]
      (mc/insert db coll {:_id oid :title title})
      (mc/update db coll {:_id oid} {$push {:tags "modifiers"}})
      (is (= {:_id oid :title title :tags ["modifiers"]}
             (mc/find-map-by-id db coll oid)))))

  (deftest add-value-to-an-existing-array-using-$push-modifier
    (let [coll  "docs"
          oid   (ObjectId.)
          title "$push modifier appends value to field"]
      (mc/insert db coll {:_id oid :title title :tags ["mongodb"]})
      (mc/update db coll {:_id oid} {$push {:tags "modifiers"}})
      (is (= {:_id oid :title title :tags ["mongodb" "modifiers"]}
             (mc/find-map-by-id db coll oid)))))


  ;; this is a common mistake, I leave it here to demonstrate it. You almost never
  ;; actually want to do this! What you really want is to use $push with $each instead of $push. MK.
  (deftest add-array-value-to-an-existing-array-using-$push-modifier
    (let [coll  "docs"
          oid   (ObjectId.)
          title "$push modifier appends value to field"]
      (mc/insert db coll {:_id oid :title title :tags ["mongodb"]})
      (mc/update db coll {:_id oid} {$push {:tags ["modifiers" "operators"]}})
      (is (= {:_id oid :title title :tags ["mongodb" ["modifiers" "operators"]]}
             (mc/find-map-by-id db coll oid)))))


  (deftest double-add-value-to-an-existing-array-using-$push-modifier
    (let [coll  "docs"
          oid   (ObjectId.)
          title "$push modifier appends value to field"]
      (mc/insert db coll {:_id oid :title title :tags ["mongodb"]})
      (mc/update db coll {:_id oid} {$push {:tags "modifiers"}})
      (mc/update db coll {:_id oid} {$push {:tags "modifiers"}})
      (is (= {:_id oid :title title :tags ["mongodb" "modifiers" "modifiers"]}
             (mc/find-map-by-id db coll oid)))))

  ;;
  ;; $push $each
  ;;

  (deftest initialize-an-array-using-$push-$each-modifier
    (let [coll  "docs"
          oid   (ObjectId.)
          title "$push with $each modifier appends multiple values to field"]
      (mc/insert db coll {:_id oid :title title})
      (mc/update db coll {:_id oid} {$push {:tags {$each ["mongodb" "docs"]}}})
      (is (= {:_id oid :title title :tags ["mongodb" "docs"]}
             (mc/find-map-by-id db coll oid)))))

  (deftest add-values-to-an-existing-array-using-$push-$each-modifier
    (let [coll  "docs"
          oid   (ObjectId.)
          title "$push with $each modifier appends multiple values to field"]
      (mc/insert db coll {:_id oid :title title :tags ["mongodb"]})
      (mc/update db coll {:_id oid} {$push {:tags {$each ["modifiers" "docs"]}}})
      (is (= {:_id oid :title title :tags ["mongodb" "modifiers" "docs"]}
             (mc/find-map-by-id db coll oid)))))

  (deftest double-add-value-to-an-existing-array-using-$push-$each-modifier
    (let [coll  "docs"
          oid   (ObjectId.)
          title "$push with $each modifier appends multiple values to field"]
      (mc/insert db coll {:_id oid :title title :tags ["mongodb" "docs"]})
      (mc/update db coll {:_id oid} {$push {:tags {$each ["modifiers" "docs"]}}})
      (is (= {:_id oid :title title :tags ["mongodb" "docs" "modifiers" "docs"]}
             (mc/find-map-by-id db coll oid)))))

  ;;
  ;; $push + $each (formerly $pushAll)
  ;;

  (deftest initialize-an-array-using-$push-and-$each-modifiers
    (let [coll  "docs"
          oid   (ObjectId.)
          title "$pushAll modifier appends multiple values to field"]
      (mc/insert db coll {:_id oid :title title})
      (mc/update db coll {:_id oid} {$push {:tags {$each ["mongodb" "docs"]}}})
      (is (= {:_id oid :title title :tags ["mongodb" "docs"]}
             (mc/find-map-by-id db coll oid)))))

  (deftest add-value-to-an-existing-array-using-$push-and-$each-modifier
    (let [coll  "docs"
          oid   (ObjectId.)
          title "$pushAll modifier appends multiple values to field"]
      (mc/insert db coll {:_id oid :title title :tags ["mongodb"]})
      (mc/update db coll {:_id oid} {$push {:tags {$each ["modifiers" "docs"]}}})
      (is (= {:_id oid :title title :tags ["mongodb" "modifiers" "docs"]}
             (mc/find-map-by-id db coll oid)))))


  (deftest double-add-value-to-an-existing-array-using-$push-and-$each-modifier
    (let [coll  "docs"
          oid   (ObjectId.)
          title "$pushAll modifier appends multiple values to field"]
      (mc/insert db coll {:_id oid :title title :tags ["mongodb" "docs"]})
      (mc/update db coll {:_id oid} {$push {:tags {$each ["modifiers" "docs"]}}})
      (is (= {:_id oid :title title :tags ["mongodb" "docs" "modifiers" "docs"]}
             (mc/find-map-by-id db coll oid)))))


  ;;
  ;; $addToSet
  ;;

  (deftest initialize-an-array-using-$addToSet-modifier
    (let [coll  "docs"
          oid   (ObjectId.)
          title "$addToSet modifier appends value to field unless it is already there"]
      (mc/insert db coll {:_id oid :title title})
      (mc/update db coll {:_id oid} {$addToSet {:tags "modifiers"}})
      (is (= {:_id oid :title title :tags ["modifiers"]}
             (mc/find-map-by-id db coll oid)))))

  (deftest add-value-to-an-existing-array-using-$addToSet-modifier
    (let [coll  "docs"
          oid   (ObjectId.)
          title "$addToSet modifier appends value to field unless it is already there"]
      (mc/insert db coll {:_id oid :title title :tags ["mongodb"]})
      (mc/update db coll {:_id oid} {$addToSet {:tags "modifiers"}})
      (is (= {:_id oid :title title :tags ["mongodb" "modifiers"]}
             (mc/find-map-by-id db coll oid)))))


  (deftest double-add-value-to-an-existing-array-using-$addToSet-modifier
    (let [coll  "docs"
          oid   (ObjectId.)
          title "$addToSet modifier appends value to field unless it is already there"]
      (mc/insert db coll {:_id oid :title title :tags ["mongodb"]})
      (mc/update db coll {:_id oid} {$addToSet {:tags "modifiers"}})
      (mc/update db coll {:_id oid} {$addToSet {:tags "modifiers"}})
      (is (= {:_id oid :title title :tags ["mongodb" "modifiers"]}
             (mc/find-map-by-id db coll oid)))))

  ;;
  ;; $addToSet $each
  ;;

  (deftest initialize-an-array-using-$addToSet-$each-modifier
    (let [coll  "docs"
          oid   (ObjectId.)
          title "$addToSet with $each modifier appends multiple values to field unless they are already there"]
      (mc/insert db coll {:_id oid :title title})
      (mc/update db coll {:_id oid} {$addToSet {:tags {$each ["mongodb" "docs"]}}})
      (is (= {:_id oid :title title :tags ["mongodb" "docs"]}
             (mc/find-map-by-id db coll oid)))))

  (deftest add-values-to-an-existing-array-using-$addToSet-$each-modifier
    (let [coll  "docs"
          oid   (ObjectId.)
          title "$addToSet with $each modifier appends multiple values to field unless they are already there"]
      (mc/insert db coll {:_id oid :title title :tags ["mongodb"]})
      (mc/update db coll {:_id oid} {$addToSet {:tags {$each ["modifiers" "docs"]}}})
      (is (= {:_id oid :title title :tags ["mongodb" "modifiers" "docs"]}
|
||||
(mc/find-map-by-id db coll oid)))))
|
||||
|
||||
(deftest double-add-value-to-an-existing-array-using-$addToSet-$each-modifier
|
||||
(let [coll "docs"
|
||||
oid (ObjectId.)
|
||||
title "$addToSet with $each modifier appends multiple values to field unless they are already there"]
|
||||
(mc/insert db coll {:_id oid :title title :tags ["mongodb" "docs"]})
|
||||
(mc/update db coll {:_id oid} {$addToSet {:tags {$each ["modifiers" "docs" "operators"]}}})
|
||||
(is (= {:_id oid :title title :tags ["mongodb" "docs" "modifiers" "operators"]}
|
||||
(mc/find-map-by-id db coll oid)))))
|
||||
|
||||
;;
|
||||
;; $pop
|
||||
;;
|
||||
|
||||
(deftest pop-last-value-in-the-array-using-$pop-modifier
|
||||
(let [coll "docs"
|
||||
oid (ObjectId.)
|
||||
title "$pop modifier removes last or first value in the array"]
|
||||
(mc/insert db coll {:_id oid :title title :tags ["products" "apple" "reviews"]})
|
||||
(mc/update db coll {:_id oid} {$pop {:tags 1}})
|
||||
(is (= {:_id oid :title title :tags ["products" "apple"]}
|
||||
(mc/find-map-by-id db coll oid)))))
|
||||
|
||||
(deftest unshift-first-value-in-the-array-using-$pop-modifier
|
||||
(let [coll "docs"
|
||||
oid (ObjectId.)
|
||||
title "$pop modifier removes last or first value in the array"]
|
||||
(mc/insert db coll {:_id oid :title title :tags ["products" "apple" "reviews"]})
|
||||
(mc/update db coll {:_id oid} {$pop {:tags -1}})
|
||||
(is (= {:_id oid :title title :tags ["apple" "reviews"]}
|
||||
(mc/find-map-by-id db coll oid)))))
|
||||
|
||||
(deftest pop-last-values-from-multiple-arrays-using-$pop-modifier
|
||||
(let [coll "docs"
|
||||
oid (ObjectId.)
|
||||
title "$pop modifier removes last or first value in the array"]
|
||||
(mc/insert db coll {:_id oid :title title :tags ["products" "apple" "reviews"] :categories ["apple" "reviews" "drafts"]})
|
||||
(mc/update db coll {:_id oid} {$pop {:tags 1 :categories 1}})
|
||||
(is (= {:_id oid :title title :tags ["products" "apple"] :categories ["apple" "reviews"]}
|
||||
(mc/find-map-by-id db coll oid)))))
|
||||
|
||||
|
||||
;;
|
||||
;; $pull
|
||||
;;
|
||||
|
||||
(deftest remove-all-value-entries-from-array-using-$pull-modifier
|
||||
(let [coll "docs"
|
||||
oid (ObjectId.)
|
||||
title "$pull modifier removes all value entries in the array"]
|
||||
(mc/insert db coll {:_id oid :title title :measurements [1.0 1.2 1.2 1.2 1.1 1.1 1.2 1.3 1.0]})
|
||||
(mc/update db coll {:_id oid} {$pull {:measurements 1.2}})
|
||||
(is (= {:_id oid :title title :measurements [1.0 1.1 1.1 1.3 1.0]}
|
||||
(mc/find-map-by-id db coll oid)))))
|
||||
|
||||
(deftest remove-all-value-entries-from-array-using-$pull-modifier-based-on-a-condition
|
||||
(let [coll "docs"
|
||||
oid (ObjectId.)
|
||||
title "$pull modifier removes all value entries in the array"]
|
||||
(mc/insert db coll {:_id oid :title title :measurements [1.0 1.2 1.2 1.2 1.1 1.1 1.2 1.3 1.0]})
|
||||
(mc/update db coll {:_id oid} {$pull {:measurements {$gte 1.2}}})
|
||||
(is (= {:_id oid :title title :measurements [1.0 1.1 1.1 1.0]}
|
||||
(mc/find-map-by-id db coll oid)))))
|
||||
;;
|
||||
;; $pullAll
|
||||
;;
|
||||
|
||||
(deftest remove-all-value-entries-from-array-using-$pullAll-modifier
|
||||
(let [coll "docs"
|
||||
oid (ObjectId.)
|
||||
title "$pullAll modifier removes entries of multiple values in the array"]
|
||||
(mc/insert db coll {:_id oid :title title :measurements [1.0 1.2 1.2 1.2 1.1 1.1 1.2 1.3 1.0]})
|
||||
(mc/update db coll {:_id oid} {$pullAll {:measurements [1.0 1.1 1.2]}})
|
||||
(is (= {:_id oid :title title :measurements [1.3]}
|
||||
(mc/find-map-by-id db coll oid)))))
|
||||
|
||||
|
||||
;;
|
||||
;; $rename
|
||||
;;
|
||||
|
||||
(deftest rename-a-single-field-using-$rename-modifier
|
||||
(let [coll "docs"
|
||||
oid (ObjectId.)
|
||||
title "$rename renames fields"
|
||||
v [1.0 1.2 1.2 1.2 1.1 1.1 1.2 1.3 1.0]]
|
||||
(mc/insert db coll {:_id oid :title title :measurements v})
|
||||
(mc/update db coll {:_id oid} {$rename {:measurements "results"}})
|
||||
(is (= {:_id oid :title title :results v}
|
||||
(mc/find-map-by-id db coll oid)))))
|
||||
|
||||
|
||||
;;
|
||||
;; find-and-modify
|
||||
;;
|
||||
|
||||
(deftest find-and-modify-a-single-document
|
||||
(let [coll "docs"
|
||||
oid (ObjectId.)
|
||||
doc {:_id oid :name "Sophie Bangs" :level 42}
|
||||
conditions {:name "Sophie Bangs"}
|
||||
update {$inc {:level 1}}]
|
||||
(mc/insert db coll doc)
|
||||
(let [res (mc/find-and-modify db coll conditions update {:return-new true})]
|
||||
(is (= (select-keys res [:name :level]) {:name "Sophie Bangs" :level 43})))))
|
||||
|
||||
|
||||
(deftest find-and-modify-remove-a-document
|
||||
(let [coll "docs"
|
||||
oid (ObjectId.)
|
||||
doc {:_id oid :name "Sophie Bangs" :level 42}
|
||||
conditions {:name "Sophie Bangs"}]
|
||||
(mc/insert db coll doc)
|
||||
(let [res (mc/find-and-modify db coll conditions {} {:remove true})]
|
||||
(is (= (select-keys res [:name :level]) {:name "Sophie Bangs" :level 42}))
|
||||
(is (empty? (mc/find-maps db coll conditions))))))
|
||||
|
||||
|
||||
(deftest find-and-modify-upsert-a-document
|
||||
(testing "case 1"
|
||||
(let [coll "docs"
|
||||
oid (ObjectId.)
|
||||
doc {:_id oid :name "Sophie Bangs" :level 42}]
|
||||
(let [res (mc/find-and-modify db coll doc doc {:upsert true})]
|
||||
(is (empty? res))
|
||||
(is (select-keys (mc/find-map-by-id db coll oid) [:name :level]) (dissoc doc :_id)))))
|
||||
(testing "case 2"
|
||||
(let [coll "docs"
|
||||
query {:name "Sophie Bangs"}
|
||||
doc (merge query {:level 42})]
|
||||
(let [res (mc/find-and-modify db coll query doc {:upsert true :return-new true})]
|
||||
(is (:_id res))
|
||||
(is (select-keys (mc/find-map-by-id db coll (:_id res)) [:name :level]) doc)))))
|
||||
|
||||
|
||||
(deftest find-and-modify-after-sort
|
||||
(let [coll "docs"
|
||||
oid (ObjectId.)
|
||||
oid2 (ObjectId.)
|
||||
doc {:name "Sophie Bangs"}
|
||||
doc1 (assoc doc :_id oid :level 42)
|
||||
doc2 (assoc doc :_id oid2 :level 0)]
|
||||
(mc/insert-batch db coll [doc1 doc2])
|
||||
(let [res (mc/find-and-modify db coll doc {$inc {:level 1}} {:sort {:level -1}})]
|
||||
(is (= (select-keys res [:name :level]) {:name "Sophie Bangs" :level 42}))))))
|
||||
19 test/monger/test/authentication.clj Normal file

@@ -0,0 +1,19 @@
(ns monger.test.authentication
  (:require [monger core util db]
            [monger.test.helper :as helper])
  (:use [clojure.test]))

(helper/connect!)



(deftest test-authentication-with-valid-credentials
  ;; see ./bin/ci/before_script.sh. MK.
  (let [username "clojurewerkz/monger"
        pwd "monger"]
    (is (monger.core/authenticate "monger-test" username (.toCharArray pwd)))))

(deftest test-authentication-with-invalid-credentials
  (let [username "monger"
        ^String pwd (monger.util/random-str 128 32)]
    (is (not (monger.core/authenticate "monger-test2" username (.toCharArray pwd))))))
@@ -1,42 +0,0 @@
(ns monger.test.authentication-test
  (:require [monger util db]
            [monger.credentials :as mcr]
            [monger.core :as mg]
            [monger.collection :as mc]
            [clojure.test :refer :all]))

;;
;; Connection via URI
;;

(when-not (System/getenv "CI")
  (deftest ^{:authentication true} connect-to-mongo-via-uri-without-credentials
    (let [{:keys [conn db]} (mg/connect-via-uri "mongodb://127.0.0.1/monger-test4")]
      (is (-> conn .getAddress (.sameHost "127.0.0.1")))))

  (deftest ^{:authentication true} connect-to-mongo-via-uri-with-valid-credentials
    (let [{:keys [conn db]} (mg/connect-via-uri "mongodb://clojurewerkz%2Fmonger:monger@127.0.0.1/monger-test4")]
      (is (= "monger-test4" (.getName db)))
      (is (-> conn .getAddress (.sameHost "127.0.0.1")))
      (mc/remove db "documents")
      ;; make sure that the database is selected
      ;; and operations get through.
      (mc/insert db "documents" {:field "value"})
      (is (= 1 (mc/count db "documents" {}))))))

(if-let [uri (System/getenv "MONGOHQ_URL")]
  (deftest ^{:external true :authentication true} connect-to-mongo-via-uri-with-valid-credentials
    (let [{:keys [conn db]} (mg/connect-via-uri uri)]
      (is (-> conn .getAddress (.sameHost "127.0.0.1"))))))


;;
;; Regular connection
;;

(deftest ^{:authentication true} test-authentication-with-valid-credentials
  ;; see ./bin/ci/before_script.sh. MK.
  (doseq [s ["monger-test" "monger-test2" "monger-test3" "monger-test4"]]
    (let [creds (mcr/create "clojurewerkz/monger" "monger-test" "monger")
          conn (mg/connect-with-credentials "127.0.0.1" creds)]
      (mc/remove (mg/get-db conn "monger-test") "documents"))))
@@ -1,25 +0,0 @@
(ns monger.test.capped-collections-test
  (:require [monger.core :as mg]
            [monger.collection :as mc]
            [clojure.test :refer :all]
            [monger.operators :refer :all]))

(defn- megabytes
  [^long n]
  (* n 1024 1024))

(let [conn (mg/connect)
      db (mg/get-db conn "monger-test")]
  (deftest test-inserting-into-capped-collection
    (let [n 1000
          cname "cached"
          _ (mc/drop db cname)
          coll (mc/create db cname {:capped true :size (-> 16 megabytes) :max n})]
      (is (= cname (.getName coll)))
      (mc/insert-batch db cname (for [i (range 0 (+ n 100))] {:i i}))
      (is (= n (mc/count db cname)))
      ;; older elements get replaced by newer ones
      (is (not (mc/any? db cname {:i 1})))
      (is (not (mc/any? db cname {:i 5})))
      (is (not (mc/any? db cname {:i 9})))
      (is (mc/any? db cname {:i (+ n 80)})))))
212 test/monger/test/collection.clj Normal file

@@ -0,0 +1,212 @@
(set! *warn-on-reflection* true)

(ns monger.test.collection
  (:import [com.mongodb WriteResult WriteConcern DBCursor DBObject CommandResult$CommandFailure MapReduceOutput MapReduceCommand MapReduceCommand$OutputType]
           [org.bson.types ObjectId]
           [java.util Date])
  (:require [monger core util]
            [clojure stacktrace]
            [monger.collection :as mgcol]
            [monger.result :as mgres]
            [monger.conversion :as mgcnv]
            [monger.js :as js]
            [monger.test.helper :as helper])
  (:use [clojure.test]
        [monger.operators]
        [monger.test.fixtures]))

(helper/connect!)

(use-fixtures :each purge-people purge-docs purge-things purge-libraries)


;;
;; count, remove
;;

(deftest get-collection-size
  (let [collection "things"]
    (is (= 0 (mgcol/count collection)))
    (mgcol/insert-batch collection [{ :language "Clojure", :name "langohr" }
                                    { :language "Clojure", :name "monger" }
                                    { :language "Clojure", :name "incanter" }
                                    { :language "Scala", :name "akka" }])
    (is (= 4 (mgcol/count collection)))
    (is (mgcol/any? collection))
    (is (= 3 (mgcol/count monger.core/*mongodb-database* collection { :language "Clojure" })))
    (is (mgcol/any? monger.core/*mongodb-database* collection { :language "Clojure" }))
    (is (= 1 (mgcol/count collection { :language "Scala" })))
    (is (mgcol/any? collection { :language "Scala" }))
    (is (= 0 (mgcol/count monger.core/*mongodb-database* collection { :language "Python" })))
    (is (not (mgcol/any? monger.core/*mongodb-database* collection { :language "Python" })))))


(deftest remove-all-documents-from-collection
  (let [collection "libraries"]
    (mgcol/insert-batch collection [{ :language "Clojure", :name "monger" }
                                    { :language "Clojure", :name "langohr" }
                                    { :language "Clojure", :name "incanter" }
                                    { :language "Scala", :name "akka" }])
    (is (= 4 (mgcol/count collection)))
    (mgcol/remove collection)
    (is (= 0 (mgcol/count collection)))))


(deftest remove-some-documents-from-collection
  (let [collection "libraries"]
    (mgcol/insert-batch collection [{ :language "Clojure", :name "monger" }
                                    { :language "Clojure", :name "langohr" }
                                    { :language "Clojure", :name "incanter" }
                                    { :language "Scala", :name "akka" }])
    (is (= 4 (mgcol/count collection)))
    (mgcol/remove collection { :language "Clojure" })
    (is (= 1 (mgcol/count collection)))))

(deftest remove-a-single-document-from-collection
  (let [collection "libraries"
        oid (ObjectId.)]
    (mgcol/insert-batch collection [{ :language "Clojure" :name "monger" :_id oid }])
    (mgcol/remove-by-id collection oid)
    (is (= 0 (mgcol/count collection)))
    (is (nil? (mgcol/find-by-id collection oid)))))


;;
;; indexes
;;

(deftest index-operations
  (let [collection "libraries"]
    (mgcol/drop-indexes collection)
    (is (= "_id_"
           (:name (first (mgcol/indexes-on collection)))))
    (is (nil? (second (mgcol/indexes-on collection))))
    (mgcol/create-index collection { "language" 1 })
    (is (= "language_1"
           (:name (second (mgcol/indexes-on collection)))))
    (mgcol/drop-index collection "language_1")
    (is (nil? (second (mgcol/indexes-on collection))))
    (mgcol/ensure-index collection { "language" 1 })
    (is (= "language_1"
           (:name (second (mgcol/indexes-on collection)))))
    (mgcol/ensure-index collection { "language" 1 })))


;;
;; exists?, drop, create
;;

(deftest checking-for-collection-existence-when-it-does-not-exist
  (let [collection "widgets"]
    (mgcol/drop collection)
    (is (false? (mgcol/exists? collection)))))

(deftest checking-for-collection-existence-when-it-does-exist
  (let [collection "widgets"]
    (mgcol/drop collection)
    (mgcol/insert-batch collection [{ :name "widget1" }
                                    { :name "widget2" }])
    (is (mgcol/exists? collection))
    (mgcol/drop collection)
    (is (false? (mgcol/exists? collection)))
    (mgcol/create "widgets" { :capped true :size 100000 :max 10 })
    (is (mgcol/exists? collection))
    (mgcol/rename collection "gadgets")
    (is (not (mgcol/exists? collection)))
    (is (mgcol/exists? "gadgets"))
    (mgcol/drop "gadgets")))


;;
;; Map/Reduce
;;

(let [collection "widgets"
      mapper (js/load-resource "resources/mongo/js/mapfun1.js")
      reducer "function(key, values) {
                 var result = 0;
                 values.forEach(function(v) { result += v });

                 return result;
               }"
      batch [{ :state "CA" :quantity 1 :price 199.00 }
             { :state "NY" :quantity 2 :price 199.00 }
             { :state "NY" :quantity 1 :price 299.00 }
             { :state "IL" :quantity 2 :price 11.50 }
             { :state "CA" :quantity 2 :price 2.95 }
             { :state "IL" :quantity 3 :price 5.50 }]
      expected [{:_id "CA", :value 204.9} {:_id "IL", :value 39.5} {:_id "NY", :value 697.0}]]
  (deftest basic-inline-map-reduce-example
    (mgcol/remove monger.core/*mongodb-database* collection {})
    (is (mgres/ok? (mgcol/insert-batch collection batch)))
    (let [output (mgcol/map-reduce collection mapper reducer nil MapReduceCommand$OutputType/INLINE {})
          results (mgcnv/from-db-object ^DBObject (.results ^MapReduceOutput output) true)]
      (mgres/ok? output)
      (is (= expected results))))

  (deftest basic-map-reduce-example-that-replaces-named-collection
    (mgcol/remove monger.core/*mongodb-database* collection {})
    (is (mgres/ok? (mgcol/insert-batch collection batch)))
    (let [output (mgcol/map-reduce collection mapper reducer "mr_outputs" {})
          results (mgcnv/from-db-object ^DBObject (.results ^MapReduceOutput output) true)]
      (mgres/ok? output)
      (is (= 3 (monger.core/count results)))
      (is (= expected
             (map #(mgcnv/from-db-object % true) (seq results))))
      (is (= expected
             (map #(mgcnv/from-db-object % true) (mgcol/find "mr_outputs"))))
      (.drop ^MapReduceOutput output)))

  (deftest basic-map-reduce-example-that-merged-results-into-named-collection
    (mgcol/remove monger.core/*mongodb-database* collection {})
    (is (mgres/ok? (mgcol/insert-batch collection batch)))
    (mgcol/map-reduce collection mapper reducer "merged_mr_outputs" MapReduceCommand$OutputType/MERGE {})
    (is (mgres/ok? (mgcol/insert collection { :state "OR" :price 17.95 :quantity 4 })))
    (let [output (mgcol/map-reduce collection mapper reducer "merged_mr_outputs" MapReduceCommand$OutputType/MERGE {})]
      (mgres/ok? output)
      (is (= 4 (monger.core/count (.results ^MapReduceOutput output))))
      (is (= ["CA" "IL" "NY" "OR"]
             (map :_id (mgcol/find-maps "merged_mr_outputs"))))
      (.drop ^MapReduceOutput output))))


;;
;; distinct
;;

(deftest distinct-values
  (let [collection "widgets"
        batch [{ :state "CA" :quantity 1 :price 199.00 }
               { :state "NY" :quantity 2 :price 199.00 }
               { :state "NY" :quantity 1 :price 299.00 }
               { :state "IL" :quantity 2 :price 11.50 }
               { :state "CA" :quantity 2 :price 2.95 }
               { :state "IL" :quantity 3 :price 5.50 }]]
    (mgcol/insert-batch collection batch)
    (is (= ["CA" "IL" "NY"] (sort (mgcol/distinct monger.core/*mongodb-database* collection :state {}))))
    (is (= ["CA" "NY"] (sort (mgcol/distinct collection :state { :price { $gt 100.00 } }))))))


;;
;; any?, empty?
;;

(deftest any-on-empty-collection
  (let [collection "things"]
    (is (not (mgcol/any? collection)))))

(deftest any-on-non-empty-collection
  (let [collection "things"
        _ (mgcol/insert collection { :language "Clojure", :name "langohr" })]
    (is (mgcol/any? "things"))
    (is (mgcol/any? monger.core/*mongodb-database* "things" {:language "Clojure"}))))

(deftest empty-on-empty-collection
  (let [collection "things"]
    (is (mgcol/empty? collection))
    (is (mgcol/empty? monger.core/*mongodb-database* collection))))

(deftest empty-on-non-empty-collection
  (let [collection "things"
        _ (mgcol/insert collection { :language "Clojure", :name "langohr" })]
    (is (not (mgcol/empty? "things")))))
@@ -1,193 +0,0 @@
(ns monger.test.collection-test
  (:import org.bson.types.ObjectId
           java.util.Date)
  (:require [monger.core :as mg]
            [monger.collection :as mc]
            [clojure.test :refer :all]
            [monger.operators :refer :all]))

(let [conn (mg/connect)
      db (mg/get-db conn "monger-test")]

  (defn purge-collections
    [f]
    (mc/remove db "people")
    (mc/remove db "docs")
    (mc/remove db "things")
    (mc/remove db "libraries")
    (f)
    (mc/remove db "people")
    (mc/remove db "docs")
    (mc/remove db "things")
    (mc/remove db "libraries"))

  (use-fixtures :each purge-collections)

  ;;
  ;; count, remove
  ;;

  (deftest get-collection-size
    (let [collection "things"]
      (is (= 0 (mc/count db collection)))
      (mc/insert-batch db collection [{:language "Clojure" :name "langohr"}
                                      {:language "Clojure" :name "monger"}
                                      {:language "Clojure" :name "incanter"}
                                      {:language "Scala" :name "akka"}])
      (is (= 4 (mc/count db collection)))
      (is (mc/any? db collection))
      (is (= 3 (mc/count db collection {:language "Clojure"})))
      (is (mc/any? db collection {:language "Clojure"}))
      (is (= 1 (mc/count db collection {:language "Scala"})))
      (is (mc/any? db collection {:language "Scala"}))
      (is (= 0 (mc/count db collection {:language "Python"})))
      (is (not (mc/any? db collection {:language "Python"})))))


  (deftest remove-all-documents-from-collection
    (let [collection "libraries"]
      (mc/insert-batch db collection [{:language "Clojure" :name "monger"}
                                      {:language "Clojure" :name "langohr"}
                                      {:language "Clojure" :name "incanter"}
                                      {:language "Scala" :name "akka"}])
      (is (= 4 (mc/count db collection)))
      (mc/remove db collection)
      (is (= 0 (mc/count db collection)))))


  (deftest remove-some-documents-from-collection
    (let [collection "libraries"]
      (mc/insert-batch db collection [{:language "Clojure" :name "monger"}
                                      {:language "Clojure" :name "langohr"}
                                      {:language "Clojure" :name "incanter"}
                                      {:language "Scala" :name "akka"}])
      (is (= 4 (mc/count db collection)))
      (mc/remove db collection {:language "Clojure"})
      (is (= 1 (mc/count db collection)))))

  (deftest remove-a-single-document-from-collection
    (let [collection "libraries"
          oid (ObjectId.)]
      (mc/insert-batch db collection [{:language "Clojure" :name "monger" :_id oid}])
      (mc/remove-by-id db collection oid)
      (is (= 0 (mc/count db collection)))
      (is (nil? (mc/find-by-id db collection oid)))))


  ;;
  ;; exists?, drop, create
  ;;

  (deftest checking-for-collection-existence-when-it-does-not-exist
    (let [collection "widgets"]
      (mc/drop db collection)
      (is (false? (mc/exists? db collection)))))

  (deftest checking-for-collection-existence-when-it-does-exist
    (let [collection "widgets"]
      (mc/drop db collection)
      (mc/insert-batch db collection [{:name "widget1"}
                                      {:name "widget2"}])
      (is (mc/exists? db collection))
      (mc/drop db collection)
      (is (false? (mc/exists? db collection)))
      (mc/create db "widgets" {:capped true :size 100000 :max 10})
      (is (mc/exists? db collection))
      (mc/rename db collection "gadgets")
      (is (not (mc/exists? db collection)))
      (is (mc/exists? db "gadgets"))
      (mc/drop db "gadgets")))

  ;;
  ;; any?, empty?
  ;;

  (deftest test-any-on-empty-collection
    (let [collection "things"]
      (is (not (mc/any? db collection)))))

  (deftest test-any-on-non-empty-collection
    (let [collection "things"
          _ (mc/insert db collection {:language "Clojure" :name "langohr"})]
      (is (mc/any? db "things" {:language "Clojure"}))))

  (deftest test-empty-on-empty-collection
    (let [collection "things"]
      (is (mc/empty? db collection))))

  (deftest test-empty-on-non-empty-collection
    (let [collection "things"
          _ (mc/insert db collection {:language "Clojure" :name "langohr"})]
      (is (not (mc/empty? db "things")))))


  ;;
  ;; distinct
  ;;

  (deftest test-distinct-values
    (let [collection "widgets"
          batch [{:state "CA" :quantity 1 :price 199.00}
                 {:state "NY" :quantity 2 :price 199.00}
                 {:state "NY" :quantity 1 :price 299.00}
                 {:state "IL" :quantity 2 :price 11.50}
                 {:state "CA" :quantity 2 :price 2.95}
                 {:state "IL" :quantity 3 :price 5.50}]]
      (mc/insert-batch db collection batch)
      (is (= ["CA" "IL" "NY"] (sort (mc/distinct db collection :state))))
      (is (= ["CA" "IL" "NY"] (sort (mc/distinct db collection :state {}))))
      (is (= ["CA" "NY"] (sort (mc/distinct db collection :state {:price {$gt 100.00}}))))))

  ;;
  ;; update
  ;;

  (let [coll "things"
        batch [{:_id 1 :type "rock" :size "small"}
               {:_id 2 :type "bed" :size "bed-sized"}
               {:_id 3 :type "bottle" :size "1.5 liters"}]]

    (deftest test-update
      (mc/insert-batch db coll batch)
      (is (= "small" (:size (mc/find-one-as-map db coll {:type "rock"}))))
      (mc/update db coll {:type "rock"} {"$set" {:size "huge"}})
      (is (= "huge" (:size (mc/find-one-as-map db coll {:type "rock"})))))

    (deftest test-upsert
      (is (mc/empty? db coll))
      (mc/upsert db coll {:_id 4} {"$set" {:size "tiny"}})
      (is (not (mc/empty? db coll)))
      (mc/upsert db coll {:_id 4} {"$set" {:size "big"}})
      (is (= [{:_id 4 :size "big"}] (mc/find-maps db coll {:_id 4}))))

    (deftest test-update-by-id
      (mc/insert-batch db coll batch)
      (is (= "bed" (:type (mc/find-one-as-map db coll {:_id 2}))))
      (mc/update-by-id db coll 2 {"$set" {:type "living room"}})
      (is (= "living room" (:type (mc/find-one-as-map db coll {:_id 2})))))

    (deftest test-update-by-ids
      (mc/insert-batch db coll batch)
      (is (= "bed" (:type (mc/find-one-as-map db coll {:_id 2}))))
      (is (= "bottle" (:type (mc/find-one-as-map db coll {:_id 3}))))
      (mc/update-by-ids db coll [2 3] {"$set" {:type "dog"}})
      (is (= "dog" (:type (mc/find-one-as-map db coll {:_id 2}))))
      (is (= "dog" (:type (mc/find-one-as-map db coll {:_id 3}))))))

  ;;
  ;; miscellaneous
  ;;

  (deftest test-system-collection-predicate
    (are [name] (is (mc/system-collection? name))
      "system.indexes"
      "system"
      ;; we treat default GridFS collections as system ones,
      ;; possibly this is a bad idea, time will tell. MK.
      "fs.chunks"
      "fs.files")
    (are [name] (is (not (mc/system-collection? name)))
      "events"
      "accounts"
      "megacorp_account"
      "myapp_development")))
44 test/monger/test/command.clj Normal file

@@ -0,0 +1,44 @@
(ns monger.test.command
  (:require [monger core command]
            [monger.test.helper :as helper]
            [monger.collection :as mgcol])
  (:import (com.mongodb Mongo DB CommandResult))
  (:use [clojure.test]))

(helper/connect!)


(deftest test-db-stats
  (let [stats (monger.command/db-stats)]
    (is (monger.result/ok? stats))
    (is (= "monger-test" (get stats "db")))))

(deftest test-collection-stats
  (let [collection "stat_test"
        _ (mgcol/insert collection { :name "Clojure" })
        check (mgcol/count collection)
        stats (monger.command/collection-stats collection)]
    (is (monger.result/ok? stats))
    (is (= "monger-test.stat_test" (get stats "ns")))
    (is (= check (get stats "count")))))


(deftest test-reindex-collection
  (let [_ (mgcol/insert "test" { :name "Clojure" })
        result (monger.command/reindex-collection "test")]
    (is (monger.result/ok? result))
    (is (get result "indexes"))))


(deftest test-server-status
  (let [status (monger.command/server-status)]
    (is (monger.result/ok? status))
    (is (not-empty status))
    (is (get status "serverUsed"))))


(deftest test-top
  (let [result (monger.command/top)]
    (is (monger.result/ok? result))
    (is (not-empty result))
    (is (get result "serverUsed"))))
@@ -1,29 +0,0 @@
(ns monger.test.command-test
  (:require [monger.core :as mg]
            [monger.command :as mcom]
            [monger.collection :as mc]
            [clojure.test :refer :all]
            [monger.result :refer [acknowledged?]]
            [monger.conversion :refer [from-db-object]]))

(let [conn (mg/connect)
      db (mg/get-db conn "monger-test")]
  (deftest ^{:command true} test-reindex-collection
    (let [_ (mc/insert db "test" {:name "Clojure"})
          result (mcom/reindex-collection db "test")]
      (is (acknowledged? result))))

  (deftest ^{:command true} test-server-status
    (let [status (mcom/server-status db)]
      (is (acknowledged? status))
      (is (not-empty status))))

  (deftest ^{:command true} test-top
    (let [result (mcom/top conn)]
      (is (acknowledged? result))
      (is (not-empty result))))

  (deftest ^{:command true} test-running-is-master-as-an-arbitrary-command
    (let [raw (mg/command db {:isMaster 1})
          result (from-db-object raw true)]
      (is (acknowledged? raw)))))
@@ -1,11 +1,10 @@
-(ns monger.test.conversion-test
+(ns monger.test.conversion
   (:require [monger core collection]
-            [clojure.test :refer :all]
-            [monger.conversion :refer :all])
+            [monger.conversion :as cnv])
   (:import [com.mongodb DBObject BasicDBObject BasicDBList]
            [java.util Date Calendar List ArrayList]
-           org.bson.types.ObjectId
-           (org.bson.types Decimal128)))
+           [org.bson.types ObjectId])
+  (:use [clojure.test]))
 
 
 ;;
@@ -14,53 +13,36 @@
 
 (deftest convert-nil-to-dbobject
   (let [input nil
-        output (to-db-object input)]
+        output (cnv/to-db-object input)]
     (is (nil? output))))
 
 (deftest convert-integer-to-dbobject
   (let [input 1
-        output (to-db-object input)]
+        output (cnv/to-db-object input)]
     (is (= input output))))
 
 (deftest convert-float-to-dbobject
   (let [input 11.12
-        output (to-db-object input)]
+        output (cnv/to-db-object input)]
     (is (= input output))))
 
-(deftest convert-rationale-to-dbobject
-  (let [input 11/2
-        output (to-db-object input)]
-    (is (= 5.5 output))))
-
-(deftest convert-string-to-dbobject
-  (let [input "MongoDB"
-        output (to-db-object input)]
-    (is (= input output))))
-
-(deftest convert-boolean-true-to-dbobject
-  (let [input true
-        output (to-db-object input)]
-    (is (= input output))))
-
 (deftest convert-boolean-false-to-dbobject
   (let [input false
-        output (to-db-object input)]
+        output (cnv/to-db-object input)]
     (is (= input output))))
 
 
 (deftest convert-map-to-dbobject
-  (let [input { :int 1 :string "Mongo" :float 22.23 :true true :false false }
-        output ^DBObject (to-db-object input)]
+  (let [input { :int 1, :string "Mongo", :float 22.23 }
+        output ^DBObject (cnv/to-db-object input)]
     (is (= 1 (.get output "int")))
     (is (= "Mongo" (.get output "string")))
-    (is (= 22.23 (.get output "float")))
-    (is (true? (.get output "true")))
-    (is (false? (.get output "false")))))
+    (is (= 22.23 (.get output "float")))))
 
 
 (deftest convert-nested-map-to-dbobject
   (let [input { :int 1, :string "Mongo", :float 22.23, :map { :int 10, :string "Clojure", :float 11.9, :list '(1 "a" :b), :map { :key "value" } } }
-        output ^DBObject (to-db-object input)
+        output ^DBObject (cnv/to-db-object input)
         inner ^DBObject (.get output "map")]
     (is (= 10 (.get inner "int")))
     (is (= "Clojure" (.get inner "string")))
@ -72,19 +54,19 @@
|
|||
;; to obtain _id that was generated. MK.
|
||||
(deftest convert-dbobject-to-dbobject
|
||||
(let [input (BasicDBObject.)
|
||||
output (to-db-object input)]
|
||||
output (cnv/to-db-object input)]
|
||||
(is (= input output))))
|
||||
|
||||
(deftest convert-java-date-to-dbobject
|
||||
(let [date (Date.)
|
||||
input { :int 1, :string "Mongo", :date date }
|
||||
output ^DBObject (to-db-object input)]
|
||||
output ^DBObject (cnv/to-db-object input)]
|
||||
(is (= date (.get output "date")))))
|
||||
|
||||
(deftest convert-java-calendar-instance-to-dbobject
|
||||
(let [date (Calendar/getInstance)
|
||||
input { :int 1, :string "Mongo", :date date }
|
||||
output ^DBObject (to-db-object input)]
|
||||
output ^DBObject (cnv/to-db-object input)]
|
||||
(is (= date (.get output "date")))))
|
||||
|
||||
|
||||
|
|
@ -95,23 +77,16 @@
|
|||
;;
|
||||
|
||||
(deftest convert-nil-from-db-object
|
||||
(is (nil? (from-db-object nil false)))
|
||||
(is (nil? (from-db-object nil true))))
|
||||
(is (nil? (cnv/from-db-object nil false)))
|
||||
(is (nil? (cnv/from-db-object nil true))))
|
||||
|
||||
(deftest convert-integer-from-dbobject
|
||||
(is (= 2 (from-db-object 2 false)))
|
||||
(is (= 2 (from-db-object 2 true))))
|
||||
|
||||
(deftest convert-decimal-from-dbobject
|
||||
(is (= 2.3M (from-db-object (Decimal128. 2.3M) false)))
|
||||
(is (= 2.3M (from-db-object (Decimal128. 2.3M) true)))
|
||||
(is (= 2.3M (from-db-object (Decimal128/parse "2.3") true)))
|
||||
(is (not= 2.32M (from-db-object (Decimal128/parse "2.3") true)))
|
||||
)
|
||||
(is (= 2 (cnv/from-db-object 2 false)))
|
||||
(is (= 2 (cnv/from-db-object 2 true))))
|
||||
|
||||
(deftest convert-float-from-dbobject
|
||||
(is (= 3.3 (from-db-object 3.3 false)))
|
||||
(is (= 3.3 (from-db-object 3.3 true))))
|
||||
(is (= 3.3 (cnv/from-db-object 3.3 false)))
|
||||
(is (= 3.3 (cnv/from-db-object 3.3 true))))
|
||||
|
||||
(deftest convert-flat-db-object-to-map-without-keywordizing
|
||||
(let [name "Michael"
|
||||
|
|
@ -119,21 +94,21 @@
|
|||
input (doto (BasicDBObject.)
|
||||
(.put "name" name)
|
||||
(.put "age" age))
|
||||
output (from-db-object input false)]
|
||||
(is (= output { "name" name, "age" age }))
|
||||
output (cnv/from-db-object input false)]
|
||||
(is (= (output { "name" name, "age" age })))
|
||||
(is (= (output "name") name))
|
||||
(is (nil? (output :name)))
|
||||
(is (= (output "age") age))
|
||||
(is (nil? (output "points")))))
|
||||
|
||||
(deftest convert-flat-db-object-to-map-with-keywordizing
|
||||
(deftest convert-flat-db-object-to-map-without-keywordizing
|
||||
(let [name "Michael"
|
||||
age 26
|
||||
input (doto (BasicDBObject.)
|
||||
(.put "name" name)
|
||||
(.put "age" age))
|
||||
output (from-db-object input true)]
|
||||
(is (= output { :name name, :age age }))
|
||||
output (cnv/from-db-object input true)]
|
||||
(is (= (output { :name name, :age age })))
|
||||
(is (= (output :name) name))
|
||||
(is (nil? (output "name")))
|
||||
(is (= (output :age) age))
|
||||
|
|
@ -151,7 +126,7 @@
|
|||
input (doto (BasicDBObject.)
|
||||
(.put "_id" did)
|
||||
(.put "nested" nested))
|
||||
output (from-db-object input false)]
|
||||
output (cnv/from-db-object input false)]
|
||||
(is (= (output "_id") did))
|
||||
(is (= (-> output (get "nested") (get "int")) 101))
|
||||
(is (= (-> output (get "nested") (get "list")) ["red" "green" "blue"]))
|
||||
|
|
@ -165,18 +140,5 @@
|
|||
|
||||
(deftest test-conversion-to-object-id
|
||||
(let [output (ObjectId. "4efb39370364238a81020502")]
|
||||
(is (= output (to-object-id "4efb39370364238a81020502")))
|
||||
(is (= output (to-object-id output)))))
|
||||
|
||||
|
||||
;;
|
||||
;; Field selector coercion
|
||||
;;
|
||||
|
||||
(deftest test-field-selector-coercion
|
||||
(are [i o] (is (= (from-db-object (as-field-selector i) true) o))
|
||||
[:a :b :c] {:a 1 :b 1 :c 1}
|
||||
'(:a :b :c) {:a 1 :b 1 :c 1}
|
||||
{:a 1 :b 1 :c 1} {:a 1 :b 1 :c 1}
|
||||
{"a" 1 "b" 1 "c" 1} {:a 1 :b 1 :c 1}
|
||||
{:comments 0} {:comments 0}))
|
||||
(is (= output (cnv/to-object-id "4efb39370364238a81020502")))
|
||||
(is (= output (cnv/to-object-id output)))))
|
||||
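The conversion hunks above show the same API under two styles: master refers `to-db-object`/`from-db-object` directly, the factories branch aliases the namespace as `cnv`. The round-trip the tests exercise can be sketched as follows (illustrative only; assumes a REPL with Monger on the classpath):

```clojure
(require '[monger.conversion :refer [to-db-object from-db-object]])

;; Clojure map -> com.mongodb.DBObject and back;
;; the second argument controls whether keys are keywordized.
(let [dbo (to-db-object {:name "Clojure" :age 18})]
  (from-db-object dbo true))
;; a map with keyword keys, e.g. {:name "Clojure" :age 18}
```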
92  test/monger/test/core.clj  Normal file
@@ -0,0 +1,92 @@
(ns monger.test.core
  (:require [monger core collection util result]
            [monger.test.helper :as helper]
            [monger.collection :as mgcol])
  (:import [com.mongodb Mongo DB WriteConcern MongoOptions ServerAddress])
  (:use [clojure.test]
        [monger.core :only [server-address mongo-options]]))

(helper/connect!)

(deftest connect-to-mongo-with-default-host-and-port
  (let [connection (monger.core/connect)]
    (is (instance? com.mongodb.Mongo connection))))


(deftest connect-to-mongo-with-default-host-and-explicit-port
  (let [connection (monger.core/connect { :port 27017 })]
    (is (instance? com.mongodb.Mongo connection))))


(deftest connect-to-mongo-with-default-port-and-explicit-host
  (let [connection (monger.core/connect { :host "127.0.0.1" })]
    (is (instance? com.mongodb.Mongo connection))))


(deftest test-mongo-options-builder
  (let [max-wait-time (* 1000 60 2)
        ^MongoOptions result (monger.core/mongo-options :connections-per-host 3 :threads-allowed-to-block-for-connection-multiplier 50
                                                        :max-wait-time max-wait-time :connect-timeout 10 :socket-timeout 10 :socket-keep-alive true
                                                        :auto-connect-retry true :max-auto-connect-retry-time 0 :safe true
                                                        :w 1 :w-timeout 20 :fsync true :j true)]
    (is (= 3 (. result connectionsPerHost)))
    (is (= 50 (. result threadsAllowedToBlockForConnectionMultiplier)))
    (is (= max-wait-time (.maxWaitTime result)))
    (is (= 10 (.connectTimeout result)))
    (is (= 10 (.socketTimeout result)))
    (is (.socketKeepAlive result))
    (is (.autoConnectRetry result))
    (is (= 0 (.maxAutoConnectRetryTime result)))
    (is (.safe result))
    (is (= 1 (.w result)))
    (is (= 20 (.wtimeout result)))
    (is (.fsync result))
    (is (.j result))))

(deftest test-server-address
  (let [host "127.0.0.1"
        port 7878
        ^ServerAddress sa (server-address host port)]
    (is (= host (.getHost sa)))
    (is (= port (.getPort sa)))))

(deftest use-existing-mongo-connection
  (let [^MongoOptions opts (mongo-options :threads-allowed-to-block-for-connection-multiplier 300)
        connection (Mongo. "127.0.0.1" opts)]
    (monger.core/set-connection! connection)
    (is (= monger.core/*mongodb-connection* connection))))

(deftest connect-to-mongo-with-extra-options
  (let [^MongoOptions opts (mongo-options :threads-allowed-to-block-for-connection-multiplier 300)
        ^ServerAddress sa (server-address "127.0.0.1" 27017)]
    (monger.core/connect! sa opts)))


(deftest get-database
  (let [connection (monger.core/connect)
        db (monger.core/get-db connection "monger-test")]
    (is (instance? com.mongodb.DB db))))


(deftest test-get-db-names
  (let [dbs (monger.core/get-db-names)]
    (is (not (empty? dbs)))
    (is (dbs "monger-test"))))

(deftest issuing-a-command
  "Some commands require administrative priviledges or complex data / checks or heavily depend on DB version. They will be ommited here."
  (let [collection "things"]
    (doseq [c [{ :profile 1 }
               { :listCommands 1 }
               { :dbStats 1 }
               { :collstats "things" :scale (* 1024 1024) }
               { :getLastError 1 }]]
      (is (monger.result/ok? (monger.core/command c))))))

(deftest get-last-error
  (let [connection (monger.core/connect)
        db (monger.core/get-db connection "monger-test")]
    (is (monger.result/ok? (monger.core/get-last-error)))
    (is (monger.result/ok? (monger.core/get-last-error db)))
    (is (monger.result/ok? (monger.core/get-last-error db WriteConcern/NORMAL)))
    (is (monger.result/ok? (monger.core/get-last-error db 1 100 true)))))
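The core test files in this diff capture the connection-API shift between the branches: the factories branch installs a global connection (`helper/connect!`, `set-connection!`, the `*mongodb-connection*` dynamic var), while master threads an explicit `MongoClient` and database handle through every call. A minimal master-style session looks roughly like this (illustrative sketch; assumes a local MongoDB on the default port):

```clojure
(require '[monger.core :as mg]
         '[monger.collection :as mc])

;; Explicit connection and database handles, passed to every operation:
(let [conn (mg/connect)                      ; localhost:27017 by default
      db   (mg/get-db conn "monger-test")]
  (mc/insert db "things" {:name "Clojure"})  ; insert into the "things" collection
  (mc/count db "things"))                    ; count documents in it
```

The explicit-handle style makes tests like the ones above self-contained: each `deftest` owns its connection instead of depending on mutable global state.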
@@ -1,89 +0,0 @@
(ns monger.test.core-test
  (:require [monger util result]
            [monger.core :as mg :refer [server-address mongo-options]]
            [monger.collection :as mc]
            [clojure.test :refer :all])
  (:import [com.mongodb MongoClient DB WriteConcern MongoClientOptions ServerAddress]))

(println (str "Using Clojure version " *clojure-version*))

(deftest connect-to-mongo-with-default-host-and-port
  (let [connection (mg/connect)]
    (is (instance? com.mongodb.MongoClient connection))))

(deftest connect-and-disconnect
  (let [conn (mg/connect)]
    (mg/disconnect conn)))

(deftest connect-to-mongo-with-default-host-and-explicit-port
  (let [connection (mg/connect {:port 27017})]
    (is (instance? com.mongodb.MongoClient connection))))


(deftest connect-to-mongo-with-default-port-and-explicit-host
  (let [connection (mg/connect {:host "127.0.0.1"})]
    (is (instance? com.mongodb.MongoClient connection))))

(deftest test-server-address
  (let [host "127.0.0.1"
        port 7878
        ^ServerAddress sa (server-address host port)]
    (is (= host (.getHost sa)))
    (is (= port (.getPort sa)))))

(deftest use-existing-mongo-connection
  (let [^MongoClientOptions opts (mongo-options {:threads-allowed-to-block-for-connection-multiplier 300})
        connection (MongoClient. "127.0.0.1" opts)
        db (mg/get-db connection "monger-test")]
    (mg/disconnect connection)))

(deftest connect-to-mongo-with-extra-options
  (let [^MongoClientOptions opts (mongo-options {:threads-allowed-to-block-for-connection-multiplier 300})
        ^ServerAddress sa (server-address "127.0.0.1" 27017)
        conn (mg/connect sa opts)]
    (mg/disconnect conn)))


(deftest get-database
  (let [connection (mg/connect)
        db (mg/get-db connection "monger-test")]
    (is (instance? com.mongodb.DB db))))


(deftest test-get-db-names
  (let [conn (mg/connect)
        dbs (mg/get-db-names conn)]
    (is (not (empty? dbs)))
    (is (dbs "monger-test"))))

(deftest monger-options-test
  (let [opts {:always-use-mbeans true
              :application-name "app"
              :connect-timeout 1
              :connections-per-host 1
              :cursor-finalizer-enabled true
              :description "Description"
              :heartbeat-connect-timeout 1
              :heartbeat-frequency 1
              :heartbeat-socket-timeout 1
              :local-threshold 1
              :max-connection-idle-time 1
              :max-connection-life-time 1
              :max-wait-time 1
              :min-connections-per-host 1
              :min-heartbeat-frequency 1
              :required-replica-set-name "rs"
              :retry-writes true
              :server-selection-timeout 1
              :socket-keep-alive true
              :socket-timeout 1
              :ssl-enabled true
              :ssl-invalid-host-name-allowed true
              :threads-allowed-to-block-for-connection-multiplier 1
              :uuid-representation org.bson.UuidRepresentation/STANDARD
              :write-concern com.mongodb.WriteConcern/JOURNAL_SAFE}]
    (is (instance? com.mongodb.MongoClientOptions$Builder (mg/mongo-options-builder opts)))))

(deftest connect-to-uri-without-db-name
  (let [uri "mongodb://localhost:27017"]
    (is (thrown? IllegalArgumentException (mg/connect-via-uri uri)))))
@@ -1,107 +0,0 @@
(ns monger.test.cursor-test
  (:import [com.mongodb DBCursor DBObject Bytes]
           [java.util List Map])
  (:require [monger.core :as mg]
            [clojure.test :refer :all]
            [monger.cursor :refer :all]))

(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")]
  (deftest make-db-cursor-for-collection
    (is (= DBCursor
           (class (make-db-cursor db :docs)))))

  (deftest getting-cursor-options-value
    (let [db-cur (make-db-cursor db :docs)
          opts   (get-options db-cur)]
      (is (= true (isa? (class opts) Map)))
      (is (= 0 (.getOptions db-cur))) ;; test default value
      (is (= false (:notimeout opts)))
      (is (= false (:partial opts)))
      (is (= false (:awaitdata opts)))
      (is (= false (:oplogreplay opts)))
      (is (= false (:slaveok opts)))
      (is (= false (:tailable opts)))))

  (deftest adding-option-to-cursor
    (let [db-cur (make-db-cursor db :docs)]
      (add-option! db-cur :notimeout)
      (is (= (:notimeout cursor-options)
             (.getOptions db-cur)))
      (add-option! db-cur :tailable)
      (is (= (.getOptions db-cur)
             (bit-or (:notimeout cursor-options)
                     (:tailable cursor-options))))))

  (deftest remove-option-from-cursor
    (let [db-cur (make-db-cursor db :docs)]
      (add-option! db-cur :partial)
      (add-option! db-cur :awaitdata)
      ;; removing not-set option should not affect result
      (remove-option! db-cur :notimeout)
      (is (= (.getOptions db-cur)
             (bit-or (:partial cursor-options)
                     (:awaitdata cursor-options))))
      ;; removing active option should remove correct value
      (remove-option! db-cur :awaitdata)
      (is (= (.getOptions db-cur)
             (:partial cursor-options)))))


  (deftest test-reset-options
    (let [db-cur (make-db-cursor db :docs)]
      (add-option! db-cur :partial)
      (is (= (.getOptions db-cur)
             (:partial cursor-options)))
      (is (= 0
             (int (.getOptions (reset-options db-cur)))))))

  (deftest add-options-with-hashmap
    (let [db-cur (make-db-cursor db :docs)
          _      (add-options db-cur {:notimeout true :slaveok true})
          opts   (get-options db-cur)]
      (is (= true (:notimeout opts)))
      (is (= true (:slaveok opts)))
      (is (= false (:tailable opts)))
      (is (= false (:oplogreplay opts)))))

  (deftest add-options-with-hashmap-and-remove-option
    (let [db-cur (make-db-cursor db :docs)
          _      (add-options db-cur {:notimeout true :slaveok true})
          opts   (get-options db-cur)]
      (is (= true (:notimeout opts)))
      (is (= true (:slaveok opts)))
      ;; remove key and add another option
      (add-options db-cur {:partial true :slaveok false})
      (let [opts (get-options db-cur)]
        (is (= true (:notimeout opts)))
        (is (= true (:partial opts)))
        (is (= false (:slaveok opts)))
        (is (= false (:tailable opts))))))

  (deftest add-options-with-list
    (let [db-cur (make-db-cursor db :docs)
          _      (add-options db-cur [:notimeout :slaveok])
          opts   (get-options db-cur)]
      (is (= true (:notimeout opts)))
      (is (= true (:slaveok opts)))
      (is (= false (:tailable opts)))
      (is (= false (:oplogreplay opts)))))

  (deftest add-options-with-Bytes
    (let [db-cur (make-db-cursor db :docs)
          _      (add-options db-cur Bytes/QUERYOPTION_NOTIMEOUT)
          opts   (get-options db-cur)]
      (is (= true (:notimeout opts)))
      (is (= false (:slaveok opts)))
      (is (= false (:tailable opts)))
      (is (= false (:oplogreplay opts)))))

  (deftest add-options-with-one-keyword
    (let [db-cur (make-db-cursor db :docs)
          _      (add-options db-cur :notimeout)
          opts   (get-options db-cur)]
      (is (= true (:notimeout opts)))
      (is (= false (:slaveok opts)))
      (is (= false (:tailable opts)))
      (is (= false (:oplogreplay opts))))))
40  test/monger/test/db.clj  Normal file
@@ -0,0 +1,40 @@
(ns monger.test.db
  (:require [monger core db]
            [monger.test.helper :as helper]
            [monger.collection :as mgcol])
  (:import [com.mongodb Mongo DB]
           [java.util Set])
  (:use [clojure.test]))

(helper/connect!)


(deftest test-add-user
  (let [username "clojurewerkz/monger!"
        pwd      (.toCharArray "monger!")
        db-name  "monger-test4"]
    ;; use a secondary database here. MK.
    (monger.core/with-db (monger.core/get-db db-name)
      (monger.db/add-user username pwd)
      (is (monger.core/authenticate db-name username pwd)))))


(deftest test-drop-database
  ;; drop a secondary database here. MK.
  (monger.core/with-db (monger.core/get-db "monger-test3")
    (let [collection "test"
          _     (mgcol/insert collection { :name "Clojure" })
          check (mgcol/count collection)
          _     (monger.db/drop-db)]
      (is (= 1 check))
      (is (not (mgcol/exists? collection)))
      (is (= 0 (mgcol/count collection))))))


(deftest test-get-collection-names
  (mgcol/insert "test-1" { :name "Clojure" })
  (mgcol/insert "test-2" { :name "Clojure" })
  (let [^Set collections (monger.db/get-collection-names)]
    (is (.contains collections "test-1"))
    (is (.contains collections "test-2"))))
@@ -1,31 +0,0 @@
(ns monger.test.db-test
  (:require [monger.db :as mdb]
            [monger.core :as mg]
            [monger.collection :as mc]
            [clojure.test :refer :all])
  (:import [com.mongodb Mongo DB]
           java.util.Set))


;; do not run this test for CI, it complicates matters by messing up
;; authentication for some other tests :( MK.
(let [conn (mg/connect)]
  (when-not (System/getenv "CI")
    (deftest test-drop-database
      ;; drop a secondary database here. MK.
      (let [db    (mg/get-db conn "monger-test3")
            collection "test"
            _     (mc/insert db collection {:name "Clojure"})
            check (mc/count db collection)
            _     (mdb/drop-db db)]
        (is (= 1 check))
        (is (not (mc/exists? db collection)))
        (is (= 0 (mc/count db collection))))))

  (deftest test-get-collection-names
    (let [db (mg/get-db conn "monger-test")]
      (mc/insert db "test-1" {:name "Clojure"})
      (mc/insert db "test-2" {:name "Clojure"})
      (let [^Set xs (mdb/get-collection-names db)]
        (is (.contains xs "test-1"))
        (is (.contains xs "test-2"))))))
121  test/monger/test/factory_dsl.clj  Normal file
@@ -0,0 +1,121 @@
(ns monger.test.factory-dsl
  (:use [clojure.test]
        [monger testing joda-time]
        [monger.test.fixtures]
        [clj-time.core :only [days ago weeks now]])
  (:require [monger.collection :as mc]
            [monger.test.helper :as helper])
  (:import [org.bson.types ObjectId]
           [org.joda.time DateTime]))


(helper/connect!)

(use-fixtures :each purge-domains purge-pages)


(defaults-for "domains"
  :ipv6-enabled false)

(factory "domains" "clojure"
         :name "clojure.org"
         :created-at (-> 2 days ago)
         :embedded [(embedded-doc "pages" "http://clojure.org/lisp")
                    (embedded-doc "pages" "http://clojure.org/jvm_hosted")
                    (embedded-doc "pages" "http://clojure.org/runtime_polymorphism")])

(factory "domains" "elixir"
         :name "elixir-lang.org"
         :created-at (fn [] (now))
         :topics (fn [] ["programming" "erlang" "beam" "ruby"])
         :related {
                   :terms (fn [] ["erlang" "python" "ruby"])
                   })

(factory "pages" "http://clojure.org/rationale"
         :name "/rationale"
         :domain-id (parent-id "domains" "clojure"))
(factory "pages" "http://clojure.org/jvm_hosted"
         :name "/jvm_hosted")
(factory "pages" "http://clojure.org/runtime_polymorphism"
         :name "/runtime_polymorphism")
(factory "pages" "http://clojure.org/lisp"
         :name "/lisp")

(deftest test-building-documents-from-a-factory-case-1
  (let [t   (-> 2 weeks ago)
        doc (build "domains" "clojure" :created-at t)]
    (is (:_id doc))
    (is (= t (:created-at doc)))
    (is (= "clojure.org" (:name doc)))
    (is (false? (:ipv6-enabled doc)))))

(deftest test-building-documents-from-a-factory-case-2
  (let [oid (ObjectId.)
        doc (build "domains" "clojure" :_id oid)]
    (is (= oid (:_id doc)))
    (is (= "clojure.org" (:name doc)))
    (is (false? (:ipv6-enabled doc)))))

(deftest test-building-documents-from-a-factory-case-3
  (let [oid (ObjectId.)
        t   (-> 3 weeks ago)
        doc (build "domains" "clojure" :_id oid :created-at t :name "clojurewerkz.org" :ipv6-enabled true)]
    (is (= oid (:_id doc)))
    (is (= t (:created-at doc)))
    (is (= "clojurewerkz.org" (:name doc)))
    (is (:ipv6-enabled doc))
    (is (= ["/lisp" "/jvm_hosted" "/runtime_polymorphism"]
           (vec (map :name (:embedded doc)))))))


(deftest test-building-documents-from-a-factory-case-4
  (let [doc (build "domains" "elixir")]
    (is (:_id doc))
    (is (instance? DateTime (:created-at doc)))
    (is (= ["erlang" "python" "ruby"] (get-in doc [:related :terms])))
    (is (= "elixir-lang.org" (:name doc)))
    (is (not (:ipv6-enabled doc)))))

(deftest test-building-child-documents-with-a-parent-ref-case-1
  (let [doc (build "pages" "http://clojure.org/rationale")]
    (is (:domain-id doc))))


(deftest test-seeding-documents-using-a-factory-case-1
  (is (mc/empty? "domains"))
  (let [t   (-> 2 weeks ago)
        doc (seed "domains" "clojure" :created-at t)]
    (is (= 1 (mc/count "domains")))
    (is (:_id doc))
    (is (= t (:created-at doc)))
    (is (= "clojure.org" (:name doc)))
    (is (false? (:ipv6-enabled doc)))))

(deftest test-seeding-documents-using-a-factory-case-2
  (is (mc/empty? "domains"))
  (let [doc    (seed "domains" "elixir")
        loaded (first (mc/find-maps "domains"))]
    (is (= 1 (mc/count "domains")))
    (is (:_id doc))
    (is (= (:_id doc) (:_id loaded)))
    (is (instance? DateTime (:created-at loaded)))
    (is (= ["erlang" "python" "ruby"] (get-in loaded [:related :terms])))
    (is (= "elixir-lang.org" (:name loaded)))
    (is (not (:ipv6-enabled loaded)))))


(deftest test-seeding-child-documents-with-a-parent-ref-case-1
  (is (mc/empty? "domains"))
  (is (mc/empty? "pages"))
  (let [page   (seed "pages" "http://clojure.org/rationale")
        domain (mc/find-map-by-id "domains" (:domain-id page))]
    (is (= 1 (mc/count "domains")))
    (is (= 1 (mc/count "pages")))
    (is domain)
    (is (:domain-id page))
    (is (= "clojure.org" (:name domain)))
    (is (= "/rationale" (:name page)))))
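The factory DSL exercised in the file above follows a build/seed split: `build` only constructs a document map, merging `defaults-for` values, calling any function-valued attributes, and assigning an `:_id`; `seed` does the same and also inserts the result; `parent-id` resolves a reference by seeding the named parent factory. A condensed usage sketch with hypothetical factory names (assumes the `monger.testing` namespace from this branch and `now` from `clj-time.core`):

```clojure
;; "accounts"/"basic" is an illustrative factory, not one from the diff.
(defaults-for "accounts"
  :plan "free")

(factory "accounts" "basic"
         :login "joe"
         :created-at (fn [] (now)))  ; function attrs are re-evaluated per build

(build "accounts" "basic" :plan "pro")  ; returns a map with :_id, does not insert
(seed  "accounts" "basic")              ; same map, but also inserted into MongoDB
```

Attribute overrides passed to `build`/`seed` win over both factory attributes and defaults, which is what cases 1 through 3 above verify.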
16  test/monger/test/fixtures.clj  Normal file
@@ -0,0 +1,16 @@
(ns monger.test.fixtures
  (:require [monger.collection :as mgcol])
  (:use [monger.testing]))

;;
;; fixture functions
;;

(defcleaner people "people")
(defcleaner docs "docs")
(defcleaner things "things")
(defcleaner libraries "libraries")
(defcleaner scores "scores")
(defcleaner locations "locations")
(defcleaner domains "domains")
(defcleaner pages "pages")
@@ -1,28 +0,0 @@
(ns monger.test.full-text-search-test
  (:require [monger.core :as mg]
            [monger.collection :as mc]
            [monger.command :as cmd]
            [monger.operators :refer :all]
            [clojure.test :refer [deftest is use-fixtures]]
            [monger.result :refer [acknowledged?]])
  (:import com.mongodb.BasicDBObjectBuilder))

(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")
      coll "search-docs"]

  (defn purge-collections
    [f]
    (mc/purge-many db [coll])
    (f)
    (mc/purge-many db [coll]))

  (use-fixtures :each purge-collections)

  (deftest ^{:search true} test-basic-full-text-search-query
    (mc/ensure-index db coll (array-map :subject "text" :content "text"))
    (mc/insert db coll {:subject "hello there" :content "this should be searchable"})
    (mc/insert db coll {:subject "untitled" :content "this is just noize"})
    (let [xs (mc/find-maps db coll {$text {$search "hello"}})]
      (is (= 1 (count xs)))
      (is (= "hello there" (-> xs first :subject))))))
130  test/monger/test/gridfs.clj  Normal file
@@ -0,0 +1,130 @@
(ns monger.test.gridfs
  (:refer-clojure :exclude [count remove find])
  (:use [clojure.test]
        [monger.core :only [count]]
        [monger.test.fixtures]
        [monger operators conversion]
        [monger.gridfs :only (store make-input-file)])
  (:require [monger.gridfs :as gridfs]
            [monger.test.helper :as helper]
            [clojure.java.io :as io])
  (:import [java.io InputStream File FileInputStream]
           [com.mongodb.gridfs GridFS GridFSInputFile GridFSDBFile]))


(defn purge-gridfs
  [f]
  (gridfs/remove-all)
  (f)
  (gridfs/remove-all))

(use-fixtures :each purge-gridfs)

(helper/connect!)


(deftest test-storing-files-to-gridfs-using-relative-fs-paths
  (let [input "./test/resources/mongo/js/mapfun1.js"]
    (is (= 0 (count (gridfs/all-files))))
    (store (make-input-file input)
           (.setFilename "monger.test.gridfs.file1")
           (.setContentType "application/octet-stream"))
    (is (= 1 (count (gridfs/all-files))))))


(deftest test-storing-files-to-gridfs-using-file-instances
  (let [input (io/as-file "./test/resources/mongo/js/mapfun1.js")]
    (is (= 0 (count (gridfs/all-files))))
    (store (make-input-file input)
           (.setFilename "monger.test.gridfs.file2")
           (.setContentType "application/octet-stream"))
    (is (= 1 (count (gridfs/all-files))))))

(deftest test-storing-bytes-to-gridfs
  (let [input (.getBytes "A string")]
    (is (= 0 (count (gridfs/all-files))))
    (store (make-input-file input)
           (.setFilename "monger.test.gridfs.file3")
           (.setContentType "application/octet-stream"))
    (is (= 1 (count (gridfs/all-files))))))

(deftest test-storing-files-to-gridfs-using-absolute-fs-paths
  (let [tmp-file (File/createTempFile "monger.test.gridfs" "test-storing-files-to-gridfs-using-absolute-fs-paths")
        _        (spit tmp-file "Some content")
        input    (.getAbsolutePath tmp-file)]
    (is (= 0 (count (gridfs/all-files))))
    (store (make-input-file input)
           (.setFilename "monger.test.gridfs.file4")
           (.setContentType "application/octet-stream"))
    (is (= 1 (count (gridfs/all-files))))))

(deftest test-storing-files-to-gridfs-using-input-stream
  (let [tmp-file (File/createTempFile "monger.test.gridfs" "test-storing-files-to-gridfs-using-input-stream")
        _        (spit tmp-file "Some other content")]
    (is (= 0 (count (gridfs/all-files))))
    (store (make-input-file (FileInputStream. tmp-file))
           (.setFilename "monger.test.gridfs.file4b")
           (.setContentType "application/octet-stream"))
    (is (= 1 (count (gridfs/all-files))))))


(deftest test-finding-individual-files-on-gridfs
  (let [input    "./test/resources/mongo/js/mapfun1.js"
        ct       "binary/octet-stream"
        filename "monger.test.gridfs.file5"
        md5      "14a09deabb50925a3381315149017bbd"
        stored   (store (make-input-file input)
                        (.setFilename filename)
                        (.setContentType ct))]
    (is (= 1 (count (gridfs/all-files))))
    (is (:_id stored))
    (is (:uploadDate stored))
    (is (= 62 (:length stored)))
    (is (= md5 (:md5 stored)))
    (is (= filename (:filename stored)))
    (is (= ct (:contentType stored)))
    (are [a b] (is (= a (:md5 (from-db-object (gridfs/find-one b) true))))
         md5 (:_id stored)
         md5 filename
         md5 (to-db-object { :md5 md5 }))))

(deftest test-finding-multiple-files-on-gridfs
  (let [input   "./test/resources/mongo/js/mapfun1.js"
        ct      "binary/octet-stream"
        md5     "14a09deabb50925a3381315149017bbd"
        stored1 (store (make-input-file input)
                       (.setFilename "monger.test.gridfs.file6")
                       (.setContentType ct))
        stored2 (store (make-input-file input)
                       (.setFilename "monger.test.gridfs.file7")
                       (.setContentType ct))
        list1   (gridfs/find "monger.test.gridfs.file6")
        list2   (gridfs/find "monger.test.gridfs.file7")
        list3   (gridfs/find "888000___.monger.test.gridfs.file")
        list4   (gridfs/find { :md5 md5 })]
    (is (= 2 (count (gridfs/all-files))))
    (are [a b] (is (= (map #(.get ^GridFSDBFile % "_id") a)
                      (map :_id b)))
         list1 [stored1]
         list2 [stored2]
         list3 []
         list4 [stored1 stored2])))


(deftest test-removing-multiple-files-from-gridfs
  (let [input   "./test/resources/mongo/js/mapfun1.js"
        ct      "binary/octet-stream"
        md5     "14a09deabb50925a3381315149017bbd"
        stored1 (store (make-input-file input)
                       (.setFilename "monger.test.gridfs.file8")
                       (.setContentType ct))
        stored2 (store (make-input-file input)
                       (.setFilename "monger.test.gridfs.file9")
                       (.setContentType ct))]
    (is (= 2 (count (gridfs/all-files))))
    (gridfs/remove { :filename "monger.test.gridfs.file8" })
    (is (= 1 (count (gridfs/all-files))))
    (gridfs/remove { :md5 md5 })
    (is (= 0 (count (gridfs/all-files))))))
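Both GridFS suites in this diff revolve around the same store-and-query cycle; the master-side file below adds an explicit `fs` handle plus the `store-file`/`filename`/`content-type`/`metadata` helpers. The core cycle, sketched against the master-style API using only calls that appear in these tests (illustrative; assumes a local MongoDB):

```clojure
(require '[monger.core :as mg]
         '[monger.gridfs :as gridfs
           :refer [store-file make-input-file filename content-type]])

(let [conn (mg/connect)
      fs   (mg/get-gridfs conn "monger-test")]
  ;; store raw bytes under a filename with an explicit content type
  (store-file (make-input-file fs (.getBytes "A string"))
              (filename "example.bin")
              (content-type "application/octet-stream"))
  ;; read stored file metadata back as Clojure maps
  (first (gridfs/files-as-maps fs)))
```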
@@ -1,212 +0,0 @@
(ns monger.test.gridfs-test
  (:refer-clojure :exclude [count remove find])
  (:require [monger.gridfs :as gridfs]
            [clojure.java.io :as io]
            [clojure.test :refer :all]
            [monger.core :as mg :refer [count]]
            [monger.operators :refer :all]
            [monger.conversion :refer :all]
            [monger.gridfs :refer [store make-input-file store-file filename content-type metadata]])
  (:import [java.io InputStream File FileInputStream]
           [com.mongodb.gridfs GridFS GridFSInputFile GridFSDBFile]))

(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")
      fs   (mg/get-gridfs conn "monger-test")]
  (defn purge-gridfs*
    []
    (gridfs/remove-all fs))

  (defn purge-gridfs
    [f]
    (gridfs/remove-all fs)
    (f)
    (gridfs/remove-all fs))

  (use-fixtures :each purge-gridfs)

  (deftest ^{:gridfs true} test-storing-files-to-gridfs-using-relative-fs-paths
    (let [input "./test/resources/mongo/js/mapfun1.js"]
      (is (= 0 (count (gridfs/all-files fs))))
      (store (make-input-file fs input)
             (.setFilename "monger.test.gridfs.file1")
             (.setContentType "application/octet-stream"))
      (is (= 1 (count (gridfs/all-files fs))))))


  (deftest ^{:gridfs true} test-storing-files-to-gridfs-using-file-instances
    (let [input (io/as-file "./test/resources/mongo/js/mapfun1.js")]
      (is (= 0 (count (gridfs/all-files fs))))
      (store-file (make-input-file fs input)
                  (filename "monger.test.gridfs.file2")
                  (content-type "application/octet-stream"))
      (is (= 1 (count (gridfs/all-files fs))))))

  (deftest ^{:gridfs true} test-storing-bytes-to-gridfs
    (let [input (.getBytes "A string")
          md    {:format "raw" :source "AwesomeCamera D95"}
          fname "monger.test.gridfs.file3"
          ct    "application/octet-stream"]
      (is (= 0 (count (gridfs/all-files fs))))
      (store-file (make-input-file fs input)
                  (filename fname)
                  (metadata md)
                  (content-type "application/octet-stream"))
      (let [f (first (gridfs/files-as-maps fs))]
        (is (= ct (:contentType f)))
        (is (= fname (:filename f)))
        (is (= md (:metadata f))))
      (is (= 1 (count (gridfs/all-files fs))))))

  (deftest ^{:gridfs true} test-storing-files-to-gridfs-using-absolute-fs-paths
    (let [tmp-file (File/createTempFile "monger.test.gridfs" "test-storing-files-to-gridfs-using-absolute-fs-paths")
          _        (spit tmp-file "Some content")
          input    (.getAbsolutePath tmp-file)]
      (is (= 0 (count (gridfs/all-files fs))))
|
||||
(store-file (make-input-file fs input)
|
||||
(filename "monger.test.gridfs.file4")
|
||||
(content-type "application/octet-stream"))
|
||||
(is (= 1 (count (gridfs/all-files fs))))))
|
||||
|
||||
(deftest ^{:gridfs true} test-storing-files-to-gridfs-using-input-stream
|
||||
(let [tmp-file (File/createTempFile "monger.test.gridfs" "test-storing-files-to-gridfs-using-input-stream")
|
||||
_ (spit tmp-file "Some other content")]
|
||||
(is (= 0 (count (gridfs/all-files fs))))
|
||||
(store-file fs
|
||||
(make-input-file (FileInputStream. tmp-file))
|
||||
(filename "monger.test.gridfs.file4b")
|
||||
(content-type "application/octet-stream"))
|
||||
(is (= 1 (count (gridfs/all-files fs))))))
|
||||
|
||||
(deftest ^{:gridfs true} test-deleting-file-instance-on-disk-after-storing
|
||||
(let [tmp-file (File/createTempFile "monger.test.gridfs" "test-deleting-file-instance-on-disk-after-storing")
|
||||
_ (spit tmp-file "to be deleted")]
|
||||
(is (= 0 (count (gridfs/all-files fs))))
|
||||
(store-file (make-input-file fs tmp-file)
|
||||
(filename "test-deleting-file-instance-on-disk-after-storing")
|
||||
(content-type "application/octet-stream"))
|
||||
(is (= 1 (count (gridfs/all-files fs))))
|
||||
(is (.delete tmp-file))))
|
||||
|
||||
|
||||
|
||||
(deftest ^{:gridfs true} test-finding-individual-files-on-gridfs
|
||||
(testing "gridfs/find-one"
|
||||
(purge-gridfs*)
|
||||
(let [input "./test/resources/mongo/js/mapfun1.js"
|
||||
ct "binary/octet-stream"
|
||||
fname "monger.test.gridfs.file5"
|
||||
md5 "14a09deabb50925a3381315149017bbd"
|
||||
stored (store-file (make-input-file fs input)
|
||||
(filename fname)
|
||||
(content-type ct))]
|
||||
(is (= 1 (count (gridfs/all-files fs))))
|
||||
(is (:_id stored))
|
||||
(is (:uploadDate stored))
|
||||
(is (= 62 (:length stored)))
|
||||
(is (= md5 (:md5 stored)))
|
||||
(is (= fname (:filename stored)))
|
||||
(is (= ct (:contentType stored)))
|
||||
(are [a b] (is (= a (:md5 (from-db-object (gridfs/find-one fs b) true))))
|
||||
md5 {:_id (:_id stored)}
|
||||
md5 (to-db-object {:md5 md5}))))
|
||||
(testing "gridfs/find-one-as-map"
|
||||
(purge-gridfs*)
|
||||
(let [input "./test/resources/mongo/js/mapfun1.js"
|
||||
ct "binary/octet-stream"
|
||||
fname "monger.test.gridfs.file6"
|
||||
md5 "14a09deabb50925a3381315149017bbd"
|
||||
stored (store-file (make-input-file fs input)
|
||||
(filename fname)
|
||||
(metadata (to-db-object {:meta "data"}))
|
||||
(content-type ct))]
|
||||
(is (= 1 (count (gridfs/all-files fs))))
|
||||
(is (:_id stored))
|
||||
(is (:uploadDate stored))
|
||||
(is (= 62 (:length stored)))
|
||||
(is (= md5 (:md5 stored)))
|
||||
(is (= fname (:filename stored)))
|
||||
(is (= ct (:contentType stored)))
|
||||
(let [m (gridfs/find-one-as-map fs {:filename fname})]
|
||||
(is (= {:meta "data"} (:metadata m))))
|
||||
(are [a query] (is (= a (:md5 (gridfs/find-one-as-map fs query))))
|
||||
md5 {:_id (:_id stored)}
|
||||
md5 {:md5 md5})))
|
||||
(testing "gridfs/find-by-id"
|
||||
(purge-gridfs*)
|
||||
(let [input "./test/resources/mongo/js/mapfun1.js"
|
||||
ct "binary/octet-stream"
|
||||
fname "monger.test.gridfs.file5"
|
||||
md5 "14a09deabb50925a3381315149017bbd"
|
||||
stored (store-file (make-input-file fs input)
|
||||
(filename fname)
|
||||
(content-type ct))]
|
||||
(is (= 1 (count (gridfs/all-files fs))))
|
||||
(is (:_id stored))
|
||||
(is (:uploadDate stored))
|
||||
(is (= 62 (:length stored)))
|
||||
(is (= md5 (:md5 stored)))
|
||||
(is (= fname (:filename stored)))
|
||||
(is (= ct (:contentType stored)))
|
||||
(are [a id] (is (= a (:md5 (from-db-object (gridfs/find-by-id fs id) true))))
|
||||
md5 (:_id stored))))
|
||||
(testing "gridfs/find-map-by-id"
|
||||
(purge-gridfs*)
|
||||
(let [input "./test/resources/mongo/js/mapfun1.js"
|
||||
ct "binary/octet-stream"
|
||||
fname "monger.test.gridfs.file6"
|
||||
md5 "14a09deabb50925a3381315149017bbd"
|
||||
stored (store-file (make-input-file fs input)
|
||||
(filename fname)
|
||||
(metadata (to-db-object {:meta "data"}))
|
||||
(content-type ct))]
|
||||
(is (= 1 (count (gridfs/all-files fs))))
|
||||
(is (:_id stored))
|
||||
(is (:uploadDate stored))
|
||||
(is (= 62 (:length stored)))
|
||||
(is (= md5 (:md5 stored)))
|
||||
(is (= fname (:filename stored)))
|
||||
(is (= ct (:contentType stored)))
|
||||
(let [m (gridfs/find-map-by-id fs (:_id stored))]
|
||||
(is (= {:meta "data"} (:metadata m))))
|
||||
(are [a id] (is (= a (:md5 (gridfs/find-map-by-id fs id))))
|
||||
md5 (:_id stored)))))
|
||||
|
||||
(deftest ^{:gridfs true} test-finding-multiple-files-on-gridfs
|
||||
(let [input "./test/resources/mongo/js/mapfun1.js"
|
||||
ct "binary/octet-stream"
|
||||
md5 "14a09deabb50925a3381315149017bbd"
|
||||
stored1 (store-file (make-input-file fs input)
|
||||
(filename "monger.test.gridfs.file6")
|
||||
(content-type ct))
|
||||
stored2 (store-file (make-input-file fs input)
|
||||
(filename "monger.test.gridfs.file7")
|
||||
(content-type ct))
|
||||
list1 (gridfs/find-by-filename fs "monger.test.gridfs.file6")
|
||||
list2 (gridfs/find-by-filename fs "monger.test.gridfs.file7")
|
||||
list3 (gridfs/find-by-filename fs "888000___.monger.test.gridfs.file")
|
||||
list4 (gridfs/find-by-md5 fs md5)]
|
||||
(is (= 2 (count (gridfs/all-files fs))))
|
||||
(are [a b] (is (= (map #(.get ^GridFSDBFile % "_id") a)
|
||||
(map :_id b)))
|
||||
list1 [stored1]
|
||||
list2 [stored2]
|
||||
list3 []
|
||||
list4 [stored1 stored2])))
|
||||
|
||||
|
||||
(deftest ^{:gridfs true} test-removing-multiple-files-from-gridfs
|
||||
(let [input "./test/resources/mongo/js/mapfun1.js"
|
||||
ct "binary/octet-stream"
|
||||
md5 "14a09deabb50925a3381315149017bbd"
|
||||
stored1 (store-file (make-input-file fs input)
|
||||
(filename "monger.test.gridfs.file8")
|
||||
(content-type ct))
|
||||
stored2 (store-file (make-input-file fs input)
|
||||
(filename "monger.test.gridfs.file9")
|
||||
(content-type ct))]
|
||||
(is (= 2 (count (gridfs/all-files fs))))
|
||||
(gridfs/remove fs { :filename "monger.test.gridfs.file8" })
|
||||
(is (= 1 (count (gridfs/all-files fs))))
|
||||
(gridfs/remove fs { :md5 md5 })
|
||||
(is (= 0 (count (gridfs/all-files fs)))))))
|
||||
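The GridFS hunks above capture the central API shift in this comparison: on the `factories` side every call relies on an implicitly bound global connection (`gridfs/all-files` with no arguments), while on `master` each call takes an explicit GridFS handle. A minimal sketch of the explicit-handle style, assuming a MongoDB server on localhost and the `monger-test` database these tests run against:

```clojure
;; Sketch only: assumes a local MongoDB and the "monger-test" database.
(ns example.gridfs-sketch
  (:require [monger.core :as mg]
            [monger.gridfs :as gridfs :refer [store-file make-input-file
                                              filename content-type]]))

(let [conn (mg/connect)
      fs   (mg/get-gridfs conn "monger-test")]
  ;; Every operation receives the GridFS handle `fs` explicitly.
  (store-file (make-input-file fs "./test/resources/mongo/js/mapfun1.js")
              (filename "example.file")
              (content-type "application/octet-stream"))
  ;; Look the file up by name, then clean up.
  (println (gridfs/find-by-filename fs "example.file"))
  (gridfs/remove fs {:filename "example.file"}))
```

Threading the handle explicitly keeps tests independent of process-wide state, which is why the fixture functions above can simply close over `fs`.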
17 test/monger/test/helper.clj Normal file

@@ -0,0 +1,17 @@
(ns monger.test.helper
  (:require [monger core util])
  (:import [com.mongodb WriteConcern]))

(def connected (atom false))
(defn connected?
  []
  @connected)

(defn connect!
  []
  (when-not (connected?)
    (do
      (monger.core/connect!)
      (monger.core/set-db! (monger.core/get-db "monger-test"))
      (monger.core/set-default-write-concern! WriteConcern/SAFE)
      (reset! connected true))))

@@ -1,49 +0,0 @@
(ns monger.test.indexing-test
  (:import org.bson.types.ObjectId
           java.util.Date)
  (:require [monger.core :as mg]
            [monger.collection :as mc]
            monger.joda-time
            [clojure.test :refer :all]
            [clj-time.core :refer [now seconds ago from-now]]))

(let [conn (mg/connect)
      db (mg/get-db conn "monger-test")]
  (deftest ^{:indexing true} test-creating-and-dropping-indexes
    (let [collection "libraries"]
      (mc/drop-indexes db collection)
      (mc/create-index db collection {"language" 1})
      (is (= "language_1"
             (:name (second (mc/indexes-on db collection)))))
      (mc/drop-indexes db collection)
      (is (nil? (second (mc/indexes-on db collection))))
      (mc/ensure-index db collection (array-map "language" 1) {:unique true})
      (is (= "language_1"
             (:name (second (mc/indexes-on db collection)))))
      (mc/drop-indexes db collection)
      (mc/ensure-index db collection (array-map "language" 1))
      (mc/drop-indexes db collection)
      (mc/ensure-index db collection (array-map "language" 1) {:unique true})
      (mc/drop-indexes db collection)
      (mc/ensure-index db collection (array-map "language" 1) "index-name" true)
      (mc/drop-indexes db collection)))

  (deftest ^{:indexing true :time-consuming true} test-ttl-collections
    (let [coll "recent_events"
          ttl 15
          sleep 65]
      (mc/remove db coll)
      (mc/drop-indexes db coll)
      (mc/ensure-index db coll (array-map :created-at 1) {:expireAfterSeconds ttl})
      (dotimes [i 100]
        (mc/insert db coll {:type "signup" :created-at (-> i seconds ago) :i i}))
      (dotimes [i 100]
        (mc/insert db coll {:type "signup" :created-at (-> i seconds from-now) :i i}))
      (is (= 200 (mc/count db coll {:type "signup"})))
      ;; sleep for > 60 seconds. MongoDB seems to run TTLMonitor once per minute, according to
      ;; the log.
      (println (format "Now sleeping for %d seconds to test TTL collections!" sleep))
      (Thread/sleep (* sleep 1000))
      (println (format "Documents in the TTL collection: %d" (mc/count db coll {:type "signup"})))
      (is (< (mc/count db coll {:type "signup"}) 100))
      (mc/remove db coll))))

81 test/monger/test/inserting.clj Normal file

@@ -0,0 +1,81 @@
(set! *warn-on-reflection* true)

(ns monger.test.inserting
  (:import [com.mongodb WriteResult WriteConcern DBCursor DBObject]
           [org.bson.types ObjectId]
           [java.util Date])
  (:require [monger core util]
            [monger.collection :as mgcol]
            [monger.result :as mgres]
            [monger.conversion :as mgcnv]
            [monger.test.helper :as helper])
  (:use [clojure.test]
        [monger.operators]
        [monger.test.fixtures]))

(helper/connect!)

(use-fixtures :each purge-people purge-docs purge-things purge-libraries)


;;
;; insert
;;

(deftest insert-a-basic-document-without-id-and-with-default-write-concern
  (let [collection "people"
        doc { :name "Joe", :age 30 }]
    (is (monger.result/ok? (mgcol/insert "people" doc)))
    (is (= 1 (mgcol/count collection)))))

(deftest insert-a-basic-document-with-explicitly-passed-database-without-id-and-with-default-write-concern
  (let [collection "people"
        doc { :name "Joe", :age 30 }]
    (dotimes [n 5]
      (is (monger.result/ok? (mgcol/insert monger.core/*mongodb-database* "people" doc WriteConcern/SAFE))))
    (is (= 5 (mgcol/count collection)))))

(deftest insert-a-basic-document-without-id-and-with-explicit-write-concern
  (let [collection "people"
        doc { :name "Joe", :age 30 }]
    (is (monger.result/ok? (mgcol/insert "people" doc WriteConcern/SAFE)))
    (is (= 1 (mgcol/count collection)))))

(deftest insert-a-basic-db-object-without-id-and-with-default-write-concern
  (let [collection "people"
        doc (mgcnv/to-db-object { :name "Joe", :age 30 })]
    (is (nil? (.get ^DBObject doc "_id")))
    (mgcol/insert "people" doc)
    (is (not (nil? (monger.util/get-id doc))))))

(deftest insert-a-map-with-id-and-with-default-write-concern
  (let [collection "people"
        id (ObjectId.)
        doc { :name "Joe", :age 30 "_id" id }
        result (mgcol/insert "people" doc)]
    (is (= id (monger.util/get-id doc)))))


;;
;; insert-batch
;;

(deftest insert-a-batch-of-basic-documents-without-ids-and-with-default-write-concern
  (let [collection "people"
        docs [{ :name "Joe", :age 30 }, { :name "Paul", :age 27 }]]
    (is (monger.result/ok? (mgcol/insert-batch "people" docs)))
    (is (= 2 (mgcol/count collection)))))

(deftest insert-a-batch-of-basic-documents-without-ids-and-with-explicit-write-concern
  (let [collection "people"
        docs [{ :name "Joe", :age 30 }, { :name "Paul", :age 27 }]]
    (is (monger.result/ok? (mgcol/insert-batch "people" docs WriteConcern/NORMAL)))
    (is (= 2 (mgcol/count collection)))))

(deftest insert-a-batch-of-basic-documents-with-explicit-database-without-ids-and-with-explicit-write-concern
  (let [collection "people"
        docs [{ :name "Joe", :age 30 }, { :name "Paul", :age 27 }]]
    (dotimes [n 44]
      (is (monger.result/ok? (mgcol/insert-batch monger.core/*mongodb-database* "people" docs WriteConcern/NORMAL))))
    (is (= 88 (mgcol/count collection)))))

@@ -1,180 +0,0 @@
(ns monger.test.inserting-test
  (:import [com.mongodb WriteResult WriteConcern DBCursor DBObject DBRef]
           org.bson.types.ObjectId
           java.util.Date)
  (:require [monger.core :as mg]
            [monger.util :as mu]
            [monger.collection :as mc]
            [clojure.test :refer :all]
            [monger.operators :refer :all]
            [monger.conversion :refer :all]))

(defrecord Metrics
    [rps eps])

(let [conn (mg/connect)
      db (mg/get-db conn "monger-test")]
  (defn purge-collections
    [f]
    (mc/remove db "people")
    (mc/remove db "docs")
    (mc/remove db "things")
    (mc/remove db "widgets")
    (f)
    (mc/remove db "people")
    (mc/remove db "docs")
    (mc/remove db "things")
    (mc/remove db "widgets"))

  (use-fixtures :each purge-collections)


  ;;
  ;; insert
  ;;

  (deftest insert-a-basic-document-without-id-and-with-default-write-concern
    (let [collection "people"
          doc {:name "Joe" :age 30}]
      (is (mc/insert db collection doc))
      (is (= 1 (mc/count db collection)))))

  (deftest insert-a-basic-document-with-explicitly-passed-database-without-id-and-with-default-write-concern
    (let [collection "people"
          doc {:name "Joe" :age 30}]
      (dotimes [n 5]
        (mc/insert db collection doc WriteConcern/SAFE))
      (is (= 5 (mc/count db collection)))))

  (deftest insert-a-basic-document-without-id-and-with-explicit-write-concern
    (let [collection "people"
          doc {:name "Joe" :age 30}]
      (is (mc/insert db collection doc WriteConcern/SAFE))
      (is (= 1 (mc/count db collection)))))

  (deftest insert-a-basic-db-object-without-id-and-with-default-write-concern
    (let [collection "people"
          doc (to-db-object {:name "Joe" :age 30})]
      (is (nil? (.get ^DBObject doc "_id")))
      (mc/insert db collection doc)
      (is (not (nil? (monger.util/get-id doc))))))

  (deftest insert-a-map-with-id-and-with-default-write-concern
    (let [collection "people"
          id (ObjectId.)
          doc {:name "Joe" :age 30 "_id" id}
          result (mc/insert db collection doc)]
      (is (= id (monger.util/get-id doc)))))

  (deftest insert-a-document-with-clojure-ratio-in-it
    (let [collection "widgets"
          id (ObjectId.)
          doc {:ratio 11/2 "_id" id}
          result (mc/insert db collection doc)]
      (is (= 5.5 (:ratio (mc/find-map-by-id db collection id))))))

  (deftest insert-a-document-with-clojure-keyword-in-it
    (let [collection "widgets"
          id (ObjectId.)
          doc {:keyword :kwd "_id" id}
          result (mc/insert db collection doc)]
      (is (= (name :kwd) (:keyword (mc/find-map-by-id db collection id))))))

  (deftest insert-a-document-with-clojure-keyword-in-a-set-in-it
    (let [collection "widgets"
          id (ObjectId.)
          doc {:keyword1 {:keyword2 #{:kw1 :kw2}} "_id" id}
          result (mc/insert db collection doc)]
      (is (= (sort ["kw1" "kw2"])
             (sort (get-in (mc/find-map-by-id db collection id) [:keyword1 :keyword2]))))))

  (deftest insert-a-document-with-clojure-record-in-it
    (let [collection "widgets"
          id (ObjectId.)
          doc {:record (Metrics. 10 20) "_id" id}
          result (mc/insert db collection doc)]
      (is (= {:rps 10 :eps 20} (:record (mc/find-map-by-id db collection id))))))

  ;; TODO: disabled until we figure out how to implement dereferencing of DBRefs
  ;; in 3.0 in a compatible way (and if that's possible at all). MK.
  #_ (deftest test-insert-a-document-with-dbref
       (mc/remove db "widgets")
       (mc/remove db "owners")
       (let [coll1 "widgets"
             coll2 "owners"
             oid (ObjectId.)
             joe (mc/insert db coll2 {:name "Joe" :_id oid})
             dbref (DBRef. coll2 oid)]
         (mc/insert db coll1 {:type "pentagon" :owner dbref})
         (let [fetched (mc/find-one-as-map db coll1 {:type "pentagon"})
               fo (:owner fetched)]
           (is (= {:_id oid :name "Joe"} (from-db-object @fo true))))))


  ;;
  ;; insert-and-return
  ;;

  (deftest insert-and-return-a-basic-document-without-id-and-with-default-write-concern
    (let [collection "people"
          doc {:name "Joe" :age 30}
          result (mc/insert-and-return db collection doc)]
      (is (= (:name doc)
             (:name result)))
      (is (= (:age doc)
             (:age result)))
      (is (:_id result))
      (is (= 1 (mc/count db collection)))))

  (deftest insert-and-return-a-basic-document-without-id-but-with-a-write-concern
    (let [collection "people"
          doc {:name "Joe" :age 30 :ratio 3/4}
          result (mc/insert-and-return db collection doc WriteConcern/FSYNC_SAFE)]
      (is (= (:name doc)
             (:name result)))
      (is (= (:age doc)
             (:age result)))
      (is (= (:ratio doc)
             (:ratio result)))
      (is (:_id result))
      (is (= 1 (mc/count db collection)))))

  (deftest insert-and-return-with-a-provided-id
    (let [collection "people"
          oid (ObjectId.)
          doc {:name "Joe" :age 30 :_id oid}
          result (mc/insert-and-return db collection doc)]
      (is (= (:_id result) (:_id doc) oid))
      (is (= 1 (mc/count db collection)))))


  ;;
  ;; insert-batch
  ;;

  (deftest insert-a-batch-of-basic-documents-without-ids-and-with-default-write-concern
    (let [collection "people"
          docs [{:name "Joe" :age 30} {:name "Paul" :age 27}]]
      (is (mc/insert-batch db collection docs))
      (is (= 2 (mc/count db collection)))))

  (deftest insert-a-batch-of-basic-documents-without-ids-and-with-explicit-write-concern
    (let [collection "people"
          docs [{:name "Joe" :age 30} {:name "Paul" :age 27}]]
      (is (mc/insert-batch db collection docs WriteConcern/FSYNCED))
      (is (= 2 (mc/count db collection)))))

  (deftest insert-a-batch-of-basic-documents-with-explicit-database-without-ids-and-with-explicit-write-concern
    (let [collection "people"
          docs [{:name "Joe" :age 30} {:name "Paul" :age 27}]]
      (dotimes [n 44]
        (is (mc/insert-batch db collection docs WriteConcern/FSYNCED)))
      (is (= 88 (mc/count db collection)))))

  (deftest insert-a-batch-of-basic-documents-from-a-lazy-sequence
    (let [collection "people"
          numbers (range 0 1000)]
      (is (mc/insert-batch db collection (map (fn [^long l]
                                                {:n l})
                                              numbers)))
      (is (= (count numbers) (mc/count db collection))))))
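The same API shift shows up in the insertion tests: the `factories` side calls `mgcol/insert` against a dynamic `*mongodb-database*` var, while the `master` side passes the database handle as the first argument of every `monger.collection` call. A minimal sketch of the explicit-db style, assuming a local MongoDB and the `monger-test` database used throughout:

```clojure
;; Sketch only: assumes a local MongoDB and the "monger-test" database.
(ns example.insert-sketch
  (:require [monger.core :as mg]
            [monger.collection :as mc]))

(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")]
  ;; master style: the database handle is threaded through every call,
  ;; so no dynamic var rebinding is needed.
  (mc/insert db "people" {:name "Joe" :age 30})
  (println (mc/count db "people"))
  (mc/remove db "people"))
```

The explicit handle is what lets the deleted `inserting-test` namespace above define its fixtures inside a plain `let` instead of relying on `helper/connect!` side effects.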
45
test/monger/test/internal/fn.clj
Normal file
45
test/monger/test/internal/fn.clj
Normal file
|
|
@ -0,0 +1,45 @@
|
|||
(ns monger.test.internal.fn
|
||||
(:use [clojure.test]
|
||||
[monger.internal.fn]))
|
||||
|
||||
|
||||
(deftest test-expand-all
|
||||
(are [i o] (is (= (expand-all i) o))
|
||||
{ :int (fn [] 1) :str "Clojure" :float (Float/valueOf 11.0) } { :int 1 :str "Clojure" :float (Float/valueOf 11.0 )}
|
||||
{ :long (fn [] (Long/valueOf 11)) } { :long (Long/valueOf 11) }
|
||||
{
|
||||
:i 1
|
||||
:l (Long/valueOf 1111)
|
||||
:s "Clojure"
|
||||
:d (Double/valueOf 11.1)
|
||||
:f (Float/valueOf 2.5)
|
||||
:v [1 2 3]
|
||||
:dyn-v [(fn [] 10) (fn [] 20) (fn [] 30)]
|
||||
:dyn-i (fn [] 1)
|
||||
:dyn-s (fn [] "Clojure (expanded)")
|
||||
:m { :nested "String" }
|
||||
:dyn-m { :abc (fn [] :abc) :nested { :a { :b { :c (fn [] "d") } } } }
|
||||
}
|
||||
{
|
||||
:i 1
|
||||
:l (Long/valueOf 1111)
|
||||
:s "Clojure"
|
||||
:d (Double/valueOf 11.1)
|
||||
:f (Float/valueOf 2.5)
|
||||
:v [1 2 3]
|
||||
:dyn-v [10 20 30]
|
||||
:dyn-i 1
|
||||
:dyn-s "Clojure (expanded)"
|
||||
:m { :nested "String" }
|
||||
:dyn-m {
|
||||
:abc :abc
|
||||
:nested { :a { :b { :c "d" } } }
|
||||
}
|
||||
}))
|
||||
|
||||
(deftest test-expand-all-with
|
||||
(let [expander-fn (fn [f]
|
||||
(* 3 (f)))]
|
||||
(are [i o] (is (= (expand-all-with i expander-fn) o))
|
||||
{ :a 1 :int (fn [] 3) } { :a 1 :int 9 }
|
||||
{ :v [(fn [] 1) (fn [] 11)] :m { :inner (fn [] 3) } :s "Clojure" } { :v [3 33] :m { :inner 9 } :s "Clojure" })))
|
||||
|
|
@ -1,6 +1,6 @@
|
|||
(ns monger.test.internal.pagination-test
|
||||
(:require [clojure.test :refer :all]
|
||||
[monger.internal.pagination :refer :all]))
|
||||
(ns monger.test.internal.pagination
|
||||
(:use [clojure.test]
|
||||
[monger.internal.pagination]))
|
||||
|
||||
(deftest test-pagination-offset
|
||||
(are [a b] (= a b)
|
||||
|
|
@ -1,6 +1,9 @@
|
|||
(ns monger.test.js-test
|
||||
(:require monger.js
|
||||
[clojure.test :refer :all]))
|
||||
(ns monger.test.js
|
||||
(:require [monger js]
|
||||
[monger.test.helper :as helper])
|
||||
(:use [clojure.test]))
|
||||
|
||||
(helper/connect!)
|
||||
|
||||
(deftest load-js-resource-using-path-on-the-classpath
|
||||
(are [c path] (= c (count (monger.js/load-resource path)))
|
||||
|
|
@ -1,16 +0,0 @@
|
|||
(ns monger.test.json-cheshire-test
|
||||
(:require [clojure.test :refer :all]
|
||||
[monger.json]
|
||||
[cheshire.core :refer :all])
|
||||
(:import org.bson.types.ObjectId
|
||||
org.bson.types.BSONTimestamp))
|
||||
|
||||
(deftest convert-dbobject-to-json
|
||||
(let [input (ObjectId.)
|
||||
output (generate-string input)]
|
||||
(is (= (str "\"" input "\"") output))))
|
||||
|
||||
(deftest convert-bson-timestamp-to-json
|
||||
(let [input (BSONTimestamp. 123 4)
|
||||
output (generate-string input)]
|
||||
(is (= "{\"time\":123,\"inc\":4}" output))))
|
||||
|
|
@ -1,16 +0,0 @@
|
|||
(ns monger.test.json-test
|
||||
(:require [clojure.test :refer :all]
|
||||
[monger.json]
|
||||
[clojure.data.json :as json])
|
||||
(:import org.bson.types.ObjectId
|
||||
org.bson.types.BSONTimestamp))
|
||||
|
||||
(deftest convert-dbobject-to-json
|
||||
(let [input (ObjectId.)
|
||||
output (json/write-str input)]
|
||||
(is (= (str "\"" input "\"") output))))
|
||||
|
||||
(deftest convert-bson-timestamp-to-json
|
||||
(let [input (BSONTimestamp. 123 4)
|
||||
output (json/write-str input)]
|
||||
(is (= "{\"time\":123,\"inc\":4}" output))))
|
||||
32
test/monger/test/lib_integration.clj
Normal file
32
test/monger/test/lib_integration.clj
Normal file
|
|
@ -0,0 +1,32 @@
|
|||
(ns monger.test.lib-integration
|
||||
(:use [clojure.test]
|
||||
[monger.json]
|
||||
[monger.joda-time]
|
||||
[monger.conversion])
|
||||
(:import [org.joda.time DateTime ReadableInstant]
|
||||
[org.joda.time.format ISODateTimeFormat]
|
||||
[java.io StringWriter PrintWriter]
|
||||
[org.bson.types ObjectId]
|
||||
[com.mongodb DBObject])
|
||||
(:require [clojure.data.json :as json]
|
||||
[clj-time.core :as t]))
|
||||
|
||||
|
||||
(deftest serialization-of-joda-datetime-to-json-with-clojure-data-json
|
||||
(is (= "\"2011-10-13T23:55:00.000Z\"" (json/json-str (t/date-time 2011 10 13 23 55 0)))))
|
||||
|
||||
(deftest serialization-of-object-id-to-json-with-clojure-data-json
|
||||
(is (= "\"4ec2d1a6b55634a935ea4ac8\"" (json/json-str (ObjectId. "4ec2d1a6b55634a935ea4ac8")))))
|
||||
|
||||
|
||||
(deftest conversion-of-joda-datetime-to-db-object
|
||||
(let [d (to-db-object (t/date-time 2011 10 13 23 55 0))]
|
||||
(is (instance? java.util.Date d))
|
||||
(is (= 1318550100000 (.getTime ^java.util.Date d)))))
|
||||
|
||||
|
||||
(deftest conversion-of-java-util-date-to-joda-datetime
|
||||
(let [input (.toDate ^DateTime (t/date-time 2011 10 13 23 55 0))
|
||||
output (from-db-object input false)]
|
||||
(is (instance? org.joda.time.DateTime output))
|
||||
(is (= input (.toDate ^DateTime output)))))
|
||||
|
|
@ -1,55 +0,0 @@
|
|||
(ns monger.test.lib-integration-test
|
||||
(:import [org.joda.time DateTime DateMidnight LocalDate]
|
||||
org.bson.types.ObjectId
|
||||
com.mongodb.DBObject)
|
||||
(:require monger.json
|
||||
monger.joda-time
|
||||
[clj-time.core :as t]
|
||||
[cheshire.core :as json]
|
||||
[clojure.test :refer :all]
|
||||
[monger.conversion :refer :all]))
|
||||
|
||||
|
||||
(deftest ^{:integration true} serialization-of-joda-datetime-to-json
|
||||
(let [dt (t/date-time 2011 10 13 23 55 0)]
|
||||
(is (= "\"2011-10-13T23:55:00.000Z\""
|
||||
(json/encode dt)))))
|
||||
|
||||
(deftest ^{:integration true} serialization-of-joda-date-to-json
|
||||
(let [d (.toDate (t/date-time 2011 10 13 23 55 0))]
|
||||
(is (= "\"2011-10-13T23:55:00Z\""
|
||||
(json/encode d)))))
|
||||
|
||||
(deftest ^{:integration true} conversion-of-joda-datetime-to-db-object
|
||||
(let [d (to-db-object (t/date-time 2011 10 13 23 55 0))]
|
||||
(is (instance? java.util.Date d))
|
||||
(is (= 1318550100000 (.getTime ^java.util.Date d)))))
|
||||
|
||||
|
||||
(deftest ^{:integration true} conversion-of-joda-datemidnight-to-db-object
|
||||
(let [d (to-db-object (DateMidnight. (t/date-time 2011 10 13)))]
|
||||
(is (instance? java.util.Date d))
|
||||
(is (= 1318464000000 (.getTime ^java.util.Date d)))))
|
||||
|
||||
(deftest ^{:integration true} conversion-of-joda-localdate-to-db-object
|
||||
(let [d (to-db-object (LocalDate. 2011 10 13))]
|
||||
(is (instance? java.util.Date d))
|
||||
(is (= 111 (.getYear ^java.util.Date d))) ;; how many years since 1900
|
||||
(is (= 9 (.getMonth ^java.util.Date d))) ;; java.util.Date counts from 0
|
||||
(is (= 13 (.getDate ^java.util.Date d)))))
|
||||
|
||||
(deftest ^{:integration true} conversion-of-java-util-date-to-joda-datetime
|
||||
(let [input (.toDate ^DateTime (t/date-time 2011 10 13 23 55 0))
|
||||
output (from-db-object input false)]
|
||||
(is (instance? org.joda.time.DateTime output))
|
||||
(is (= input (.toDate ^DateTime output)))))
|
||||
|
||||
(deftest ^{:integration true} test-reader-extensions
|
||||
(let [^DateTime d (t/date-time 2011 10 13 23 55 0)]
|
||||
(binding [*print-dup* true]
|
||||
(pr-str d))))
|
||||
|
||||
(deftest ^{:integration true} test-reader-extensions-for-localdate
|
||||
(let [^DateTime d (t/today)]
|
||||
(binding [*print-dup* true]
|
||||
(pr-str d))))
|
||||
99
test/monger/test/query_operators.clj
Normal file
99
test/monger/test/query_operators.clj
Normal file
|
|
@ -0,0 +1,99 @@
|
|||
(set! *warn-on-reflection* true)
|
||||
|
||||
(ns monger.test.query-operators
|
||||
(:import [com.mongodb WriteResult WriteConcern DBCursor DBObject CommandResult$CommandFailure MapReduceOutput MapReduceCommand MapReduceCommand$OutputType]
|
||||
[org.bson.types ObjectId]
|
||||
[java.util Date])
|
||||
(:require [monger core util]
|
||||
[clojure stacktrace]
|
||||
[monger.collection :as mgcol]
|
||||
[monger.result :as mgres]
|
||||
[monger.conversion :as mgcnv]
|
||||
[monger.js :as js]
|
||||
[monger.test.helper :as helper])
|
||||
(:use [clojure.test]
|
||||
[monger.operators]
|
||||
[monger.test.fixtures]))
|
||||
|
||||
(monger.core/connect!)
|
||||
(monger.core/set-db! (monger.core/get-db "monger-test"))
|
||||
|
||||
(use-fixtures :each purge-people purge-docs purge-things purge-libraries)
|
||||
|
||||
;;
|
||||
;; $gt, $gte, $lt, lte
|
||||
;;
|
||||
|
||||
(deftest find-with-conditional-operators-comparison
|
||||
(let [collection "libraries"]
|
||||
(mgcol/insert-batch collection [{ :language "Clojure", :name "monger" :users 1}
|
||||
{ :language "Clojure", :name "langohr" :users 5 }
|
||||
{ :language "Clojure", :name "incanter" :users 15 }
|
||||
{ :language "Scala", :name "akka" :users 150}])
|
||||
(are [a b] (= a (.count (mgcol/find collection b)))
|
||||
2 { :users { $gt 10 }}
|
||||
3 { :users { $gte 5 }}
|
||||
2 { :users { $lt 10 }}
|
||||
2 { :users { $lte 5 }}
|
||||
1 { :users { $gt 10 $lt 150 }})))
|
||||
|
||||
;;
|
||||
;; $ne
|
||||
;;
|
||||
|
||||
(deftest find-with-and-or-operators
|
||||
(let [collection "libraries"]
|
||||
(mgcol/insert-batch collection [{ :language "Ruby", :name "mongoid" :users 1}
|
||||
{ :language "Clojure", :name "langohr" :users 5 }
|
||||
{ :language "Clojure", :name "incanter" :users 15 }
|
||||
{ :language "Scala", :name "akka" :users 150}])
|
||||
(is (= 2 (.count (mgcol/find collection {$ne { :language "Clojure" }}))))))
|
||||
|
||||
|
||||
;;
;; $and, $or, $nor
;;

(deftest find-with-and-or-operators
  (let [collection "libraries"]
    (mgcol/insert-batch collection [{ :language "Ruby", :name "mongoid" :users 1}
                                    { :language "Clojure", :name "langohr" :users 5 }
                                    { :language "Clojure", :name "incanter" :users 15 }
                                    { :language "Scala", :name "akka" :users 150}])
    (is (= 1 (.count (mgcol/find collection {$and [{ :language "Clojure" }
                                                   { :users { $gt 10 } }]}))))
    (is (= 3 (.count (mgcol/find collection {$or [{ :language "Clojure" }
                                                  { :users { $gt 10 } }]}))))
    (is (= 1 (.count (mgcol/find collection {$nor [{ :language "Clojure" }
                                                   { :users { $gt 10 } }]}))))))

;;
;; $all, $in, $nin
;;

(deftest find-on-embedded-arrays
  (let [collection "libraries"]
    (mgcol/insert-batch collection [{ :language "Clojure", :tags [ "functional" ] }
                                    { :language "Scala", :tags [ "functional" "object-oriented" ] }
                                    { :language "Ruby", :tags [ "object-oriented" "dynamic" ] }])

    (is (= "Scala" (:language (first (mgcol/find-maps collection { :tags { $all [ "functional" "object-oriented" ] } })))))
    (is (= 3 (.count (mgcol/find-maps collection { :tags { $in [ "functional" "object-oriented" ] } }))))
    (is (= 2 (.count (mgcol/find-maps collection { :language { $in [ "Scala" "Ruby" ] } }))))
    (is (= 1 (.count (mgcol/find-maps collection { :tags { $nin [ "dynamic" "object-oriented" ] } }))))
    (is (= 3 (.count (mgcol/find-maps collection { :language { $nin [ "C#" ] } }))))))


(deftest find-with-conditional-operators-on-embedded-documents
  (let [collection "people"]
    (mgcol/insert-batch collection [{ :name "Bob", :comments [ { :text "Nice!" :rating 1 }
                                                               { :text "Love it" :rating 4 }
                                                               { :text "What?" :rating -5 } ] }
                                    { :name "Alice", :comments [ { :text "Yeah" :rating 2 }
                                                                 { :text "Doh" :rating 1 }
                                                                 { :text "Agreed" :rating 3 } ] } ])
    (are [a b] (= a (.count (mgcol/find collection b)))
      1 { :comments { $elemMatch { :text "Nice!" :rating { $gte 1 } } } }
      2 { "comments.rating" 1 }
      1 { "comments.rating" { $gt 3 } })))

@@ -1,146 +0,0 @@
(ns monger.test.query-operators-test
  (:require [monger.core :as mg]
            [monger.collection :as mc]
            [monger.js :as js]
            [clojure.test :refer :all]
            [clojure.set :refer [difference]]
            [monger.operators :refer :all])
  (:import [com.mongodb QueryOperators]))

;; (use-fixtures :each purge-people purge-docs purge-things purge-libraries)

(deftest every-query-operator-is-defined
  (let [driver-query-operators (->> (.getDeclaredFields QueryOperators) (map #(.get % nil)) set)
        monger-query-operators (->> (ns-publics 'monger.operators) (map (comp name first)) set)
        ;; $within is deprecated and replaced by $geoWithin since v2.4.
        ;; $uniqueDocs is deprecated since v2.6.
        deprecated-query-operators #{"$within" "$uniqueDocs"}
        ;; Query modifier operators that are deprecated in the mongo shell since v3.2
        deprecated-meta-operators #{"$comment" "$explain" "$hint" "$maxScan"
                                    "$maxTimeMS" "$max" "$min" "$orderby"
                                    "$returnKey" "$showDiskLoc" "$snapshot" "$query"}
        undefined-non-deprecated-operators (difference driver-query-operators
                                                       deprecated-query-operators
                                                       deprecated-meta-operators
                                                       monger-query-operators)]
    (is (= #{} undefined-non-deprecated-operators))))
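The completeness check above can compare `(ns-publics 'monger.operators)` against the driver's `QueryOperators` field values because each operator var in `monger.operators` is bound to the operator's literal name as a string, so `$gt` and `"$gt"` are interchangeable as query-map keys (the later tests that spell operators as strings rely on the same fact). A minimal sketch of that mechanism, with a simplified `defoperator` macro assumed for illustration:

```clojure
;; Sketch: each operator var holds its own name as a string,
;; e.g. $gt evaluates to "$gt". A simplified version of the macro:
(defmacro defoperator
  [op]
  `(def ~op ~(str op)))

(defoperator $gt)
(defoperator $lt)

;; {:users {$gt 10}} is literally {:users {"$gt" 10}}
(assert (= "$gt" $gt))
(assert (= {:users {"$gt" 10}} {:users {$gt 10}}))
```

Because the vars are plain strings, queries built with the vars and queries built with string keys serialize to identical BSON.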

(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")]
  (defn purge-collections
    [f]
    (mc/remove db "people")
    (mc/remove db "libraries")
    (f)
    (mc/remove db "people")
    (mc/remove db "libraries"))

  (use-fixtures :each purge-collections)

  ;;
  ;; $gt, $gte, $lt, $lte
  ;;

  (deftest find-with-conditional-operators-comparison
    (let [collection "libraries"]
      (mc/insert-batch db collection [{:language "Clojure" :name "monger" :users 1}
                                      {:language "Clojure" :name "langohr" :users 5}
                                      {:language "Clojure" :name "incanter" :users 15}
                                      {:language "Scala" :name "akka" :users 150}])
      (are [a b] (= a (.count (mc/find db collection b)))
        2 {:users {$gt 10}}
        3 {:users {$gte 5}}
        2 {:users {$lt 10}}
        2 {:users {$lte 5}}
        1 {:users {$gt 10 $lt 150}})))

  ;;
  ;; $eq
  ;;

  (deftest find-with-eq-operator
    (let [collection "libraries"]
      (mc/insert-batch db collection [{:language "Ruby" :name "mongoid" :users 1 :displayName nil}
                                      {:language "Clojure" :name "langohr" :users 5}
                                      {:language "Clojure" :name "incanter" :users 15}
                                      {:language "Scala" :name "akka" :users 150}])
      (is (= 2 (.count (mc/find db collection {:language {$eq "Clojure"}}))))))

  ;;
  ;; $ne
  ;;

  (deftest find-with-ne-operator
    (let [collection "libraries"]
      (mc/insert-batch db collection [{:language "Ruby" :name "mongoid" :users 1}
                                      {:language "Clojure" :name "langohr" :users 5}
                                      {:language "Clojure" :name "incanter" :users 15}
                                      {:language "Scala" :name "akka" :users 150}])
      (is (= 2 (.count (mc/find db collection {:language {$ne "Clojure"}}))))))


  ;;
  ;; $and, $or, $nor
  ;;

  (deftest find-with-and-or-operators
    (let [collection "libraries"]
      (mc/insert-batch db collection [{:language "Ruby" :name "mongoid" :users 1}
                                      {:language "Clojure" :name "langohr" :users 5}
                                      {:language "Clojure" :name "incanter" :users 15}
                                      {:language "Scala" :name "akka" :users 150}])
      (is (= 1 (.count (mc/find db collection {$and [{:language "Clojure"}
                                                     {:users {$gt 10}}]}))))
      (is (= 3 (.count (mc/find db collection {$or [{:language "Clojure"}
                                                    {:users {$gt 10}}]}))))
      (is (= 1 (.count (mc/find db collection {$nor [{:language "Clojure"}
                                                     {:users {$gt 10}}]}))))))

  ;;
  ;; $all, $in, $nin
  ;;

  (deftest find-on-embedded-arrays
    (let [collection "libraries"]
      (mc/insert-batch db collection [{:language "Clojure" :tags ["functional"]}
                                      {:language "Scala" :tags ["functional" "object-oriented"]}
                                      {:language "Ruby" :tags ["object-oriented" "dynamic"]}])

      (is (= "Scala" (:language (first (mc/find-maps db collection {:tags {$all ["functional" "object-oriented"]}})))))
      (is (= 3 (.count (mc/find-maps db collection {:tags {$in ["functional" "object-oriented"]}}))))
      (is (= 2 (.count (mc/find-maps db collection {:language {$in ["Scala" "Ruby"]}}))))
      (is (= 1 (.count (mc/find-maps db collection {:tags {$nin ["dynamic" "object-oriented"]}}))))
      (is (= 3 (.count (mc/find-maps db collection {:language {$nin ["C#"]}}))))))


  (deftest find-with-conditional-operators-on-embedded-documents
    (let [collection "people"]
      (mc/insert-batch db collection [{:name "Bob" :comments [{:text "Nice!" :rating 1}
                                                              {:text "Love it" :rating 4}
                                                              {:text "What?" :rating -5}]}
                                      {:name "Alice" :comments [{:text "Yeah" :rating 2}
                                                                {:text "Doh" :rating 1}
                                                                {:text "Agreed" :rating 3}]}])
      (are [a b] (= a (.count (mc/find db collection b)))
        1 {:comments {$elemMatch {:text "Nice!" :rating {$gte 1}}}}
        2 {"comments.rating" 1}
        1 {"comments.rating" {$gt 3}})))

  (deftest find-with-regex-operator
    (let [collection "libraries"]
      (mc/insert-batch db collection [{:language "Ruby" :name "Mongoid" :users 1}
                                      {:language "Clojure" :name "Langohr" :users 5}
                                      {:language "Clojure" :name "Incanter" :users 15}
                                      {:language "Scala" :name "Akka" :users 150}])
      (are [query results] (is (= results (.count (mc/find db collection query))))
        {:language {$regex "Clo.*"}} 2
        {:language {$regex "clo.*" $options "i"}} 2
        {:name {$regex "aK.*" $options "i"}} 1
        {:language {$regex ".*by"}} 1
        {:language {$regex ".*ala.*"}} 1)))

  (deftest find-with-js-expression
    (let [collection "people"]
      (mc/insert-batch db collection [{:name "Bob" :placeOfBirth "New York" :address {:city "New York"}}
                                      {:name "Alice" :placeOfBirth "New York" :address {:city "Los Angeles"}}])
      (is (= 1 (.count (mc/find db collection {$where "this.placeOfBirth === this.address.city"})))))))
282 test/monger/test/querying.clj (new file)

@@ -0,0 +1,282 @@
(set! *warn-on-reflection* true)

(ns monger.test.querying
  (:refer-clojure :exclude [select find sort])
  (:import [com.mongodb WriteResult WriteConcern DBCursor DBObject CommandResult$CommandFailure ReadPreference]
           [org.bson.types ObjectId]
           [java.util Date])
  (:require [monger core util]
            [monger.collection :as mgcol]
            [monger.result :as mgres]
            [monger.test.helper :as helper])
  (:use [clojure.test]
        [monger.test.fixtures]
        [monger conversion query operators joda-time]
        [clj-time.core :only [date-time]]))

(helper/connect!)

(use-fixtures :each purge-docs purge-things purge-locations)


;;
;; monger.collection/* finders ("low-level API")
;;

;; by ObjectId

(deftest query-full-document-by-object-id
  (let [coll "docs"
        oid  (ObjectId.)
        doc  { :_id oid :title "Introducing Monger" }]
    (mgcol/insert coll doc)
    (is (= doc (mgcol/find-map-by-id coll oid)))
    (is (= doc (mgcol/find-one-as-map coll { :_id oid })))))


;; exact match over string field

(deftest query-full-document-using-exact-matching-over-string-field
  (let [coll "docs"
        doc  { :title "monger" :language "Clojure" :_id (ObjectId.) }]
    (mgcol/insert coll doc)
    (is (= [doc] (mgcol/find-maps coll { :title "monger" })))
    (is (= doc (from-db-object (first (mgcol/find coll { :title "monger" })) true)))))


;; exact match over string field with limit

(deftest query-full-document-using-exact-matching-over-string-with-field-with-limit
  (let [coll "docs"
        doc1   { :title "monger" :language "Clojure" :_id (ObjectId.) }
        doc2   { :title "langohr" :language "Clojure" :_id (ObjectId.) }
        doc3   { :title "netty" :language "Java" :_id (ObjectId.) }
        _      (mgcol/insert-batch coll [doc1 doc2 doc3])
        result (with-collection coll
                 (find { :title "monger" })
                 (fields [:title :language :_id])
                 (skip 0)
                 (limit 1))]
    (is (= 1 (count result)))
    (is (= [doc1] result))))


(deftest query-full-document-using-exact-matching-over-string-field-with-limit-and-offset
  (let [coll "docs"
        doc1   { :title "lucene" :language "Java" :_id (ObjectId.) }
        doc2   { :title "joda-time" :language "Java" :_id (ObjectId.) }
        doc3   { :title "netty" :language "Java" :_id (ObjectId.) }
        _      (mgcol/insert-batch coll [doc1 doc2 doc3])
        result (with-collection coll
                 (find { :language "Java" })
                 (skip 1)
                 (limit 2)
                 (sort { :title 1 }))]
    (is (= 2 (count result)))
    (is (= [doc1 doc3] result))))


;; < ($lt), <= ($lte), > ($gt), >= ($gte)

(deftest query-using-dsl-and-$lt-operator-with-integers
  (let [coll "docs"
        doc1 { :language "Clojure" :_id (ObjectId.) :inception_year 2006 }
        doc2 { :language "Java" :_id (ObjectId.) :inception_year 1992 }
        doc3 { :language "Scala" :_id (ObjectId.) :inception_year 2003 }
        _    (mgcol/insert-batch coll [doc1 doc2])
        lt-result (with-collection "docs"
                    (find { :inception_year { $lt 2000 } })
                    (limit 2))]
    (is (= [doc2] (vec lt-result)))))


(deftest query-using-dsl-and-$lt-operator-with-dates
  (let [coll "docs"
        ;; these rely on monger.joda-time being loaded. MK.
        doc1 { :language "Clojure" :_id (ObjectId.) :inception_year (date-time 2006 1 1) }
        doc2 { :language "Java" :_id (ObjectId.) :inception_year (date-time 1992 1 2) }
        doc3 { :language "Scala" :_id (ObjectId.) :inception_year (date-time 2003 3 3) }
        _    (mgcol/insert-batch coll [doc1 doc2])
        lt-result (with-collection "docs"
                    (find { :inception_year { $lt (date-time 2000 1 2) } })
                    (limit 2))]
    (is (= (map :_id [doc2])
           (map :_id (vec lt-result))))))

(deftest query-using-both-$lte-and-$gte-operators-with-dates
  (let [coll "docs"
        ;; these rely on monger.joda-time being loaded. MK.
        doc1 { :language "Clojure" :_id (ObjectId.) :inception_year (date-time 2006 1 1) }
        doc2 { :language "Java" :_id (ObjectId.) :inception_year (date-time 1992 1 2) }
        doc3 { :language "Scala" :_id (ObjectId.) :inception_year (date-time 2003 3 3) }
        _    (mgcol/insert-batch coll [doc1 doc2 doc3])
        lt-result (with-collection "docs"
                    (find { :inception_year { $gt (date-time 2000 1 2) $lte (date-time 2007 2 2) } })
                    (sort { :inception_year 1 }))]
    (is (= (map :_id [doc3 doc1])
           (map :_id (vec lt-result))))))


(deftest query-using-$gt-$lt-$gte-$lte-operators-as-strings
  (let [coll "docs"
        doc1 { :language "Clojure" :_id (ObjectId.) :inception_year 2006 }
        doc2 { :language "Java" :_id (ObjectId.) :inception_year 1992 }
        doc3 { :language "Scala" :_id (ObjectId.) :inception_year 2003 }
        _    (mgcol/insert-batch coll [doc1 doc2 doc3])]
    (are [doc result]
         (= [doc] result)
      doc2 (with-collection coll
             (find { :inception_year { "$lt" 2000 } }))
      doc2 (with-collection coll
             (find { :inception_year { "$lte" 1992 } }))
      doc1 (with-collection coll
             (find { :inception_year { "$gt" 2002 } })
             (limit 1)
             (sort { :inception_year -1 }))
      doc1 (with-collection coll
             (find { :inception_year { "$gte" 2006 } })))))


(deftest query-using-$gt-$lt-$gte-$lte-operators-using-dsl-composition
  (let [coll "docs"
        doc1 { :language "Clojure" :_id (ObjectId.) :inception_year 2006 }
        doc2 { :language "Java" :_id (ObjectId.) :inception_year 1992 }
        doc3 { :language "Scala" :_id (ObjectId.) :inception_year 2003 }
        srt  (-> {}
                 (limit 1)
                 (sort { :inception_year -1 }))
        _    (mgcol/insert-batch coll [doc1 doc2 doc3])]
    (is (= [doc1] (with-collection coll
                    (find { :inception_year { "$gt" 2002 } })
                    (merge srt))))))


;; $all

(deftest query-with-using-$all
  (let [coll "docs"
        doc1 { :_id (ObjectId.) :title "Clojure" :tags ["functional" "homoiconic" "syntax-oriented" "dsls" "concurrency features" "jvm"] }
        doc2 { :_id (ObjectId.) :title "Java" :tags ["object-oriented" "jvm"] }
        doc3 { :_id (ObjectId.) :title "Scala" :tags ["functional" "object-oriented" "dsls" "concurrency features" "jvm"] }
        _    (mgcol/insert-batch coll [doc1 doc2 doc3])
        result1 (with-collection coll
                  (find { :tags { "$all" ["functional" "jvm" "homoiconic"] } }))
        result2 (with-collection coll
                  (find { :tags { "$all" ["functional" "native" "homoiconic"] } }))
        result3 (with-collection coll
                  (find { :tags { "$all" ["functional" "jvm" "dsls"] } })
                  (sort { :title 1 }))]
    (is (= [doc1] result1))
    (is (empty? result2))
    (is (= 2 (count result3)))
    (is (= doc1 (first result3)))))


;; $exists

(deftest query-with-find-one-as-map-using-$exists
  (let [coll "docs"
        doc1 { :_id (ObjectId.) :published-by "Jill The Blogger" :draft false :title "X announces another Y" }
        doc2 { :_id (ObjectId.) :draft true :title "Z announces a Y competitor" }
        _       (mgcol/insert-batch coll [doc1 doc2])
        result1 (mgcol/find-one-as-map coll { :published-by { "$exists" true } })
        result2 (mgcol/find-one-as-map coll { :published-by { "$exists" false } })]
    (is (= doc1 result1))
    (is (= doc2 result2))))

;; $mod

(deftest query-with-find-one-as-map-using-$mod
  (let [coll "docs"
        doc1 { :_id (ObjectId.) :counter 25 }
        doc2 { :_id (ObjectId.) :counter 32 }
        doc3 { :_id (ObjectId.) :counter 63 }
        _       (mgcol/insert-batch coll [doc1 doc2 doc3])
        result1 (mgcol/find-one-as-map coll { :counter { "$mod" [10 5] } })
        result2 (mgcol/find-one-as-map coll { :counter { "$mod" [10 2] } })
        result3 (mgcol/find-one-as-map coll { :counter { "$mod" [11 1] } })]
    (is (= doc1 result1))
    (is (= doc2 result2))
    (is (empty? result3))))
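The `$mod` clauses above take a `[divisor remainder]` pair: a document matches when its field value modulo the divisor equals the remainder. The expected matches for the fixtures can be checked with plain arithmetic:

```clojure
;; $mod [divisor remainder] matches when (mod field divisor) = remainder.
;; Checking the counters 25, 32 and 63 from the fixtures above by hand:
(assert (= 5 (mod 25 10)))      ; doc1 matches {"$mod" [10 5]}
(assert (= 2 (mod 32 10)))      ; doc2 matches {"$mod" [10 2]}
(assert (not= 1 (mod 25 11)))   ; no counter leaves remainder 1 mod 11,
(assert (not= 1 (mod 32 11)))   ; so the {"$mod" [11 1]} query finds
(assert (not= 1 (mod 63 11)))   ; nothing and result3 is empty
```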

;; $ne

(deftest query-with-find-one-as-map-using-$ne
  (let [coll "docs"
        doc1 { :_id (ObjectId.) :counter 25 }
        doc2 { :_id (ObjectId.) :counter 32 }
        _       (mgcol/insert-batch coll [doc1 doc2])
        result1 (mgcol/find-one-as-map coll { :counter { "$ne" 25 } })
        result2 (mgcol/find-one-as-map coll { :counter { "$ne" 32 } })]
    (is (= doc2 result1))
    (is (= doc1 result2))))

;;
;; monger.query DSL features
;;

;; pagination
(deftest query-using-pagination-dsl
  (let [coll "docs"
        doc1 { :_id (ObjectId.) :title "Clojure" :tags ["functional" "homoiconic" "syntax-oriented" "dsls" "concurrency features" "jvm"] }
        doc2 { :_id (ObjectId.) :title "Java" :tags ["object-oriented" "jvm"] }
        doc3 { :_id (ObjectId.) :title "Scala" :tags ["functional" "object-oriented" "dsls" "concurrency features" "jvm"] }
        doc4 { :_id (ObjectId.) :title "Ruby" :tags ["dynamic" "object-oriented" "dsls" "jvm"] }
        doc5 { :_id (ObjectId.) :title "Groovy" :tags ["dynamic" "object-oriented" "dsls" "jvm"] }
        doc6 { :_id (ObjectId.) :title "OCaml" :tags ["functional" "static" "dsls"] }
        doc7 { :_id (ObjectId.) :title "Haskell" :tags ["functional" "static" "dsls" "concurrency features"] }
        _    (mgcol/insert-batch coll [doc1 doc2 doc3 doc4 doc5 doc6 doc7])
        result1 (with-collection coll
                  (find {})
                  (paginate :page 1 :per-page 3)
                  (sort { :title 1 })
                  (read-preference ReadPreference/PRIMARY))
        result2 (with-collection coll
                  (find {})
                  (paginate :page 2 :per-page 3)
                  (sort { :title 1 }))
        result3 (with-collection coll
                  (find {})
                  (paginate :page 3 :per-page 3)
                  (sort { :title 1 }))]
    (is (= [doc1 doc5 doc7] result1))
    (is (= [doc2 doc6 doc4] result2))
    (is (= [doc3] result3))))
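The pagination clauses above are, in effect, skip/limit arithmetic: page `p` at `n` documents per page starts at offset `(p - 1) * n`. A sketch of that assumed translation (the helper name `paginate->skip-limit` is hypothetical; `monger.query`'s real `paginate` may differ in details):

```clojure
;; Hypothetical helper showing the offset arithmetic that
;; (paginate :page p :per-page n) is assumed to perform.
(defn paginate->skip-limit
  [& {:keys [page per-page]}]
  {:skip  (* (dec page) per-page)   ; page 1 starts at offset 0
   :limit per-page})

(assert (= {:skip 0 :limit 3} (paginate->skip-limit :page 1 :per-page 3)))
(assert (= {:skip 3 :limit 3} (paginate->skip-limit :page 2 :per-page 3)))
(assert (= {:skip 6 :limit 3} (paginate->skip-limit :page 3 :per-page 3)))
```

With the seven titles above sorted alphabetically (Clojure, Groovy, Haskell, Java, OCaml, Ruby, Scala), offsets 0, 3 and 6 yield exactly the three pages the assertions expect.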

(deftest combined-querying-dsl-example1
  (let [coll "docs"
        ma-doc { :_id (ObjectId.) :name "Massachusetts" :iso "MA" :population 6547629 :joined_in 1788 :capital "Boston" }
        de-doc { :_id (ObjectId.) :name "Delaware" :iso "DE" :population 897934 :joined_in 1787 :capital "Dover" }
        ny-doc { :_id (ObjectId.) :name "New York" :iso "NY" :population 19378102 :joined_in 1788 :capital "Albany" }
        ca-doc { :_id (ObjectId.) :name "California" :iso "CA" :population 37253956 :joined_in 1850 :capital "Sacramento" }
        tx-doc { :_id (ObjectId.) :name "Texas" :iso "TX" :population 25145561 :joined_in 1845 :capital "Austin" }
        top3               (partial-query (limit 3))
        by-population-desc (partial-query (sort { :population -1 }))
        _      (mgcol/insert-batch coll [ma-doc de-doc ny-doc ca-doc tx-doc])
        result (with-collection coll
                 (find {})
                 (merge top3)
                 (merge by-population-desc))]
    (is (= result [ca-doc tx-doc ny-doc]))))

(deftest combined-querying-dsl-example2
  (let [coll "docs"
        ma-doc { :_id (ObjectId.) :name "Massachusetts" :iso "MA" :population 6547629 :joined_in 1788 :capital "Boston" }
        de-doc { :_id (ObjectId.) :name "Delaware" :iso "DE" :population 897934 :joined_in 1787 :capital "Dover" }
        ny-doc { :_id (ObjectId.) :name "New York" :iso "NY" :population 19378102 :joined_in 1788 :capital "Albany" }
        ca-doc { :_id (ObjectId.) :name "California" :iso "CA" :population 37253956 :joined_in 1850 :capital "Sacramento" }
        tx-doc { :_id (ObjectId.) :name "Texas" :iso "TX" :population 25145561 :joined_in 1845 :capital "Austin" }
        top3               (partial-query (limit 3))
        by-population-desc (partial-query (sort { :population -1 }))
        _      (mgcol/insert-batch coll [ma-doc de-doc ny-doc ca-doc tx-doc])
        result (with-collection coll
                 (find {})
                 (merge top3)
                 (merge by-population-desc)
                 (keywordize-fields false))]
    ;; documents have fields as strings,
    ;; not keywords
    (is (= (map #(% "name") result)
           (map #(% :name) [ca-doc tx-doc ny-doc])))))
@@ -1,325 +0,0 @@
(ns monger.test.querying-test
|
||||
(:refer-clojure :exclude [select find sort])
|
||||
(:import [com.mongodb WriteResult WriteConcern DBObject ReadPreference]
|
||||
org.bson.types.ObjectId
|
||||
java.util.Date)
|
||||
(:require [monger.core :as mg]
|
||||
[monger.collection :as mc]
|
||||
monger.joda-time
|
||||
[monger.result :as mgres]
|
||||
[clojure.test :refer :all]
|
||||
[monger.conversion :refer :all]
|
||||
[monger.query :refer :all]
|
||||
[monger.operators :refer :all]
|
||||
[clj-time.core :refer [date-time]]))
|
||||
|
||||
(let [conn (mg/connect)
|
||||
db (mg/get-db conn "monger-test")]
|
||||
|
||||
(defn purge-collections
|
||||
[f]
|
||||
(mc/remove db "docs")
|
||||
(mc/remove db "things")
|
||||
(mc/remove db "locations")
|
||||
(mc/remove db "querying_docs")
|
||||
(f)
|
||||
(mc/remove db "docs")
|
||||
(mc/remove db "things")
|
||||
(mc/remove db "locations")
|
||||
(mc/remove db "querying_docs"))
|
||||
|
||||
(use-fixtures :each purge-collections)
|
||||
|
||||
;;
|
||||
;; monger.collection/* finders ("low-level API")
|
||||
;;
|
||||
|
||||
;; by ObjectId
|
||||
|
||||
(deftest query-full-document-by-object-id
|
||||
(let [coll "querying_docs"
|
||||
oid (ObjectId.)
|
||||
doc { :_id oid :title "Introducing Monger" }]
|
||||
(mc/insert db coll doc)
|
||||
(is (= doc (mc/find-map-by-id db coll oid)))
|
||||
(is (= doc (mc/find-one-as-map db coll { :_id oid })))))
|
||||
|
||||
|
||||
;; exact match over string field
|
||||
|
||||
(deftest query-full-document-using-exact-matching-over-string-field
|
||||
(let [coll "querying_docs"
|
||||
doc { :title "monger" :language "Clojure" :_id (ObjectId.) }]
|
||||
(mc/insert db coll doc)
|
||||
(is (= [doc] (mc/find-maps db coll { :title "monger" })))
|
||||
(is (= doc (from-db-object (first (mc/find db coll { :title "monger" })) true)))))
|
||||
|
||||
|
||||
;; exact match over string field with limit
|
||||
|
||||
(deftest query-full-document-using-exact-matching-over-string-with-field-with-limit
|
||||
(let [coll "querying_docs"
|
||||
doc1 { :title "monger" :language "Clojure" :_id (ObjectId.) }
|
||||
doc2 { :title "langohr" :language "Clojure" :_id (ObjectId.) }
|
||||
doc3 { :title "netty" :language "Java" :_id (ObjectId.) }
|
||||
_ (mc/insert-batch db coll [doc1 doc2 doc3])
|
||||
result (with-collection db coll
|
||||
(find { :title "monger" })
|
||||
(fields [:title, :language, :_id])
|
||||
(skip 0)
|
||||
(limit 1))]
|
||||
(is (= 1 (count result)))
|
||||
(is (= [doc1] result))))
|
||||
|
||||
|
||||
(deftest query-full-document-using-exact-matching-over-string-field-with-limit-and-offset
|
||||
(let [coll "querying_docs"
|
||||
doc1 { :title "lucene" :language "Java" :_id (ObjectId.) }
|
||||
doc2 { :title "joda-time" :language "Java" :_id (ObjectId.) }
|
||||
doc3 { :title "netty" :language "Java" :_id (ObjectId.) }
|
||||
_ (mc/insert-batch db coll [doc1 doc2 doc3])
|
||||
result (with-collection db coll
|
||||
(find { :language "Java" })
|
||||
(skip 1)
|
||||
(limit 2)
|
||||
(sort { :title 1 }))]
|
||||
(is (= 2 (count result)))
|
||||
(is (= [doc1 doc3] result))))
|
||||
|
||||
(deftest query-with-sorting-on-multiple-fields
|
||||
(let [coll "querying_docs"
|
||||
doc1 { :a 1 :b 2 :c 3 :text "Whatever" :_id (ObjectId.) }
|
||||
doc2 { :a 1 :b 1 :c 4 :text "Blah " :_id (ObjectId.) }
|
||||
doc3 { :a 10 :b 3 :c 1 :text "Abc" :_id (ObjectId.) }
|
||||
doc4 { :a 10 :b 3 :c 3 :text "Abc" :_id (ObjectId.) }
|
||||
_ (mc/insert-batch db coll [doc1 doc2 doc3 doc4])
|
||||
result1 (with-collection db coll
|
||||
(find {})
|
||||
(limit 2)
|
||||
(fields [:a :b :c :text])
|
||||
(sort (sorted-map :a 1 :b 1 :text -1)))
|
||||
result2 (with-collection db coll
|
||||
(find {})
|
||||
(limit 2)
|
||||
(fields [:a :b :c :text])
|
||||
(sort (array-map :c 1 :text -1)))
|
||||
result3 (with-collection db coll
|
||||
(find {})
|
||||
(limit 2)
|
||||
(fields [:a :b :c :text])
|
||||
(sort (array-map :c -1 :text 1)))]
|
||||
(is (= [doc2 doc1] result1))
|
||||
(is (= [doc3 doc1] result2))
|
||||
(is (= [doc2 doc4] result3))))
|
||||
|
||||
|
||||
;; < ($lt), <= ($lte), > ($gt), >= ($gte)
|
||||
|
||||
(deftest query-using-dsl-and-$lt-operator-with-integers
|
||||
(let [coll "querying_docs"
|
||||
doc1 { :language "Clojure" :_id (ObjectId.) :inception_year 2006 }
|
||||
doc2 { :language "Java" :_id (ObjectId.) :inception_year 1992 }
|
||||
doc3 { :language "Scala" :_id (ObjectId.) :inception_year 2003 }
|
||||
_ (mc/insert-batch db coll [doc1 doc2])
|
||||
lt-result (with-collection db coll
|
||||
(find { :inception_year { $lt 2000 } })
|
||||
(limit 2))]
|
||||
(is (= [doc2] (vec lt-result)))))
|
||||
|
||||
|
||||
(deftest query-using-dsl-and-$lt-operator-with-dates
|
||||
(let [coll "querying_docs"
|
||||
;; these rely on monger.joda-time being loaded. MK.
|
||||
doc1 { :language "Clojure" :_id (ObjectId.) :inception_year (date-time 2006 1 1) }
|
||||
doc2 { :language "Java" :_id (ObjectId.) :inception_year (date-time 1992 1 2) }
|
||||
doc3 { :language "Scala" :_id (ObjectId.) :inception_year (date-time 2003 3 3) }
|
||||
_ (mc/insert-batch db coll [doc1 doc2])
|
||||
lt-result (with-collection db coll
|
||||
(find { :inception_year { $lt (date-time 2000 1 2) } })
|
||||
(limit 2))]
|
||||
(is (= (map :_id [doc2])
|
||||
(map :_id (vec lt-result))))))
|
||||
|
||||
(deftest query-using-both-$lte-and-$gte-operators-with-dates
|
||||
(let [coll "querying_docs"
|
||||
;; these rely on monger.joda-time being loaded. MK.
|
||||
doc1 { :language "Clojure" :_id (ObjectId.) :inception_year (date-time 2006 1 1) }
|
||||
doc2 { :language "Java" :_id (ObjectId.) :inception_year (date-time 1992 1 2) }
|
||||
doc3 { :language "Scala" :_id (ObjectId.) :inception_year (date-time 2003 3 3) }
|
||||
_ (mc/insert-batch db coll [doc1 doc2 doc3])
|
||||
lt-result (with-collection db coll
|
||||
(find { :inception_year { $gt (date-time 2000 1 2) $lte (date-time 2007 2 2) } })
|
||||
(sort { :inception_year 1 }))]
|
||||
(is (= (map :_id [doc3 doc1])
|
||||
(map :_id (vec lt-result))))))
|
||||
|
||||
|
||||
(deftest query-using-$gt-$lt-$gte-$lte-operators-as-strings
|
||||
(let [coll "querying_docs"
|
||||
doc1 { :language "Clojure" :_id (ObjectId.) :inception_year 2006 }
|
||||
doc2 { :language "Java" :_id (ObjectId.) :inception_year 1992 }
|
||||
doc3 { :language "Scala" :_id (ObjectId.) :inception_year 2003 }
|
||||
_ (mc/insert-batch db coll [doc1 doc2 doc3])]
|
||||
(are [doc, result]
|
||||
(= doc, result)
|
||||
(doc2 (with-collection db coll
|
||||
(find { :inception_year { "$lt" 2000 } })))
|
||||
(doc2 (with-collection db coll
|
||||
(find { :inception_year { "$lte" 1992 } })))
|
||||
(doc1 (with-collection db coll
|
||||
(find { :inception_year { "$gt" 2002 } })
|
||||
(limit 1)
|
||||
(sort { :inception_year -1 })))
|
||||
(doc1 (with-collection db coll
|
||||
(find { :inception_year { "$gte" 2006 } }))))))
|
||||
|
||||
|
||||
(deftest query-using-$gt-$lt-$gte-$lte-operators-using-dsl-composition
|
||||
(let [coll "querying_docs"
|
||||
doc1 { :language "Clojure" :_id (ObjectId.) :inception_year 2006 }
|
||||
doc2 { :language "Java" :_id (ObjectId.) :inception_year 1992 }
|
||||
doc3 { :language "Scala" :_id (ObjectId.) :inception_year 2003 }
|
||||
srt (-> {}
|
||||
(limit 1)
|
||||
(sort { :inception_year -1 }))
|
||||
_ (mc/insert-batch db coll [doc1 doc2 doc3])]
|
||||
(is (= [doc1] (with-collection db coll
|
||||
(find { :inception_year { "$gt" 2002 } })
|
||||
(merge srt))))))
|
||||
|
||||
|
||||
;; $all
|
||||
|
||||
(deftest query-with-using-$all
|
||||
(let [coll "querying_docs"
|
||||
doc1 { :_id (ObjectId.) :title "Clojure" :tags ["functional" "homoiconic" "syntax-oriented" "dsls" "concurrency features" "jvm"] }
|
||||
doc2 { :_id (ObjectId.) :title "Java" :tags ["object-oriented" "jvm"] }
|
||||
doc3 { :_id (ObjectId.) :title "Scala" :tags ["functional" "object-oriented" "dsls" "concurrency features" "jvm"] }
|
||||
- (mc/insert-batch db coll [doc1 doc2 doc3])
|
||||
result1 (with-collection db coll
|
||||
(find { :tags { "$all" ["functional" "jvm" "homoiconic"] } }))
|
||||
result2 (with-collection db coll
|
||||
(find { :tags { "$all" ["functional" "native" "homoiconic"] } }))
|
||||
result3 (with-collection db coll
|
||||
(find { :tags { "$all" ["functional" "jvm" "dsls"] } })
|
||||
(sort { :title 1 }))]
|
||||
(is (= [doc1] result1))
|
||||
(is (empty? result2))
|
||||
(is (= 2 (count result3)))
|
||||
(is (= doc1 (first result3)))))
|
||||
|
||||
|
||||
;; $exists
|
||||
|
||||
(deftest query-with-find-one-as-map-using-$exists
|
||||
(let [coll "querying_docs"
|
||||
doc1 { :_id (ObjectId.) :published-by "Jill The Blogger" :draft false :title "X announces another Y" }
|
||||
doc2 { :_id (ObjectId.) :draft true :title "Z announces a Y competitor" }
|
||||
_ (mc/insert-batch db coll [doc1 doc2])
|
||||
result1 (mc/find-one-as-map db coll { :published-by { "$exists" true } })
|
||||
result2 (mc/find-one-as-map db coll { :published-by { "$exists" false } })]
|
||||
(is (= doc1 result1))
|
||||
(is (= doc2 result2))))
|
||||
|
||||
;; $mod
|
||||
|
||||
(deftest query-with-find-one-as-map-using-$mod
|
||||
(let [coll "querying_docs"
|
||||
doc1 { :_id (ObjectId.) :counter 25 }
|
||||
doc2 { :_id (ObjectId.) :counter 32 }
|
||||
doc3 { :_id (ObjectId.) :counter 63 }
|
||||
_ (mc/insert-batch db coll [doc1 doc2 doc3])
|
||||
result1 (mc/find-one-as-map db coll { :counter { "$mod" [10, 5] } })
|
||||
result2 (mc/find-one-as-map db coll { :counter { "$mod" [10, 2] } })
|
||||
result3 (mc/find-one-as-map db coll { :counter { "$mod" [11, 1] } })]
|
||||
(is (= doc1 result1))
|
||||
(is (= doc2 result2))
|
||||
(is (empty? result3))))
|
||||
|
||||
|
||||
;; $ne
|
||||
|
||||
(deftest query-with-find-one-as-map-using-$ne
|
||||
(let [coll "querying_docs"
|
||||
doc1 { :_id (ObjectId.) :counter 25 }
|
||||
doc2 { :_id (ObjectId.) :counter 32 }
|
||||
_ (mc/insert-batch db coll [doc1 doc2])
|
||||
result1 (mc/find-one-as-map db coll { :counter { "$ne" 25 } })
|
||||
result2 (mc/find-one-as-map db coll { :counter { "$ne" 32 } })]
|
||||
(is (= doc2 result1))
|
||||
(is (= doc1 result2))))
|
||||
|
||||
;;
|
||||
;; monger.query DSL features
;;

;; pagination
(deftest query-using-pagination-dsl
  (let [coll "querying_docs"
        doc1 { :_id (ObjectId.) :title "Clojure" :tags ["functional" "homoiconic" "syntax-oriented" "dsls" "concurrency features" "jvm"] }
        doc2 { :_id (ObjectId.) :title "Java" :tags ["object-oriented" "jvm"] }
        doc3 { :_id (ObjectId.) :title "Scala" :tags ["functional" "object-oriented" "dsls" "concurrency features" "jvm"] }
        doc4 { :_id (ObjectId.) :title "Ruby" :tags ["dynamic" "object-oriented" "dsls" "jvm"] }
        doc5 { :_id (ObjectId.) :title "Groovy" :tags ["dynamic" "object-oriented" "dsls" "jvm"] }
        doc6 { :_id (ObjectId.) :title "OCaml" :tags ["functional" "static" "dsls"] }
        doc7 { :_id (ObjectId.) :title "Haskell" :tags ["functional" "static" "dsls" "concurrency features"] }
        _ (mc/insert-batch db coll [doc1 doc2 doc3 doc4 doc5 doc6 doc7])
        result1 (with-collection db coll
                  (find {})
                  (paginate :page 1 :per-page 3)
                  (sort { :title 1 })
                  (read-preference (ReadPreference/primary))
                  (options com.mongodb.Bytes/QUERYOPTION_NOTIMEOUT))
        result2 (with-collection db coll
                  (find {})
                  (paginate :page 2 :per-page 3)
                  (sort { :title 1 }))
        result3 (with-collection db coll
                  (find {})
                  (paginate :page 3 :per-page 3)
                  (sort { :title 1 }))
        result4 (with-collection db coll
                  (find {})
                  (paginate :page 10 :per-page 3)
                  (sort { :title 1 }))]
    (is (= [doc1 doc5 doc7] result1))
    (is (= [doc2 doc6 doc4] result2))
    (is (= [doc3] result3))
    (is (empty? result4))))


(deftest combined-querying-dsl-example1
  (let [coll "querying_docs"
        ma-doc { :_id (ObjectId.) :name "Massachusetts" :iso "MA" :population 6547629 :joined_in 1788 :capital "Boston" }
        de-doc { :_id (ObjectId.) :name "Delaware" :iso "DE" :population 897934 :joined_in 1787 :capital "Dover" }
        ny-doc { :_id (ObjectId.) :name "New York" :iso "NY" :population 19378102 :joined_in 1788 :capital "Albany" }
        ca-doc { :_id (ObjectId.) :name "California" :iso "CA" :population 37253956 :joined_in 1850 :capital "Sacramento" }
        tx-doc { :_id (ObjectId.) :name "Texas" :iso "TX" :population 25145561 :joined_in 1845 :capital "Austin" }
        top3 (partial-query (limit 3))
        by-population-desc (partial-query (sort { :population -1 }))
        _ (mc/insert-batch db coll [ma-doc de-doc ny-doc ca-doc tx-doc])
        result (with-collection db coll
                 (find {})
                 (merge top3)
                 (merge by-population-desc))]
    (is (= result [ca-doc tx-doc ny-doc]))))


(deftest combined-querying-dsl-example2
  (let [coll "querying_docs"
        ma-doc { :_id (ObjectId.) :name "Massachusetts" :iso "MA" :population 6547629 :joined_in 1788 :capital "Boston" }
        de-doc { :_id (ObjectId.) :name "Delaware" :iso "DE" :population 897934 :joined_in 1787 :capital "Dover" }
        ny-doc { :_id (ObjectId.) :name "New York" :iso "NY" :population 19378102 :joined_in 1788 :capital "Albany" }
        ca-doc { :_id (ObjectId.) :name "California" :iso "CA" :population 37253956 :joined_in 1850 :capital "Sacramento" }
        tx-doc { :_id (ObjectId.) :name "Texas" :iso "TX" :population 25145561 :joined_in 1845 :capital "Austin" }
        top3 (partial-query (limit 3))
        by-population-desc (partial-query (sort { :population -1 }))
        _ (mc/insert-batch db coll [ma-doc de-doc ny-doc ca-doc tx-doc])
        result (with-collection db coll
                 (find {})
                 (merge top3)
                 (merge by-population-desc)
                 (keywordize-fields false))]
    ;; documents have fields as strings,
    ;; not keywords
    (is (= (map #(% "name") result)
           (map #(% :name) [ca-doc tx-doc ny-doc])))))

@@ -1,55 +0,0 @@
(ns monger.test.ragtime-test
  (:require [monger.core :as mg]
            [monger.collection :as mc]
            monger.ragtime
            [ragtime.protocols :refer :all]
            [clojure.test :refer :all]))


(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")]
  (defn purge-collections
    [f]
    (mc/remove db "meta.migrations")
    (f)
    (mc/remove db "meta.migrations"))

  (use-fixtures :each purge-collections)

  (when-not (get (System/getenv) "CI")
    (deftest test-add-migration-id
      (let [coll "meta.migrations"
            key  "1"]
        (mc/remove db coll {})
        (is (not (mc/any? db coll {:_id key})))
        (is (not (some #{key} (applied-migration-ids db))))
        (add-migration-id db key)
        (is (mc/any? db coll {:_id key}))
        (is (some #{key} (applied-migration-ids db)))))


    (deftest test-remove-migration-id
      (let [coll "meta.migrations"
            key  "1"]
        (mc/remove db coll {})
        (add-migration-id db key)
        (is (mc/any? db coll {:_id key}))
        (is (some #{key} (applied-migration-ids db)))
        (remove-migration-id db key)
        (is (not (some #{key} (applied-migration-ids db))))))


    (deftest test-migrations-ordering
      (let [coll     "meta.migrations"
            all-keys ["9" "4" "7" "1" "5" "3" "6" "2" "8"]]
        (mc/remove db coll {})

        (doseq [key all-keys]
          (add-migration-id db key))

        (doseq [key all-keys]
          (is (mc/any? db coll {:_id key}))
          (is (some #{key} (applied-migration-ids db))))

        (testing "Applied migrations must come out in creation order"
          (is (= all-keys (applied-migration-ids db))))))))

266 test/monger/test/regular_finders.clj Normal file
@@ -0,0 +1,266 @@
(set! *warn-on-reflection* true)

(ns monger.test.regular-finders
  (:import [com.mongodb WriteResult WriteConcern DBCursor DBObject]
           [org.bson.types ObjectId]
           [java.util Date])
  (:require [monger core util]
            [monger.collection :as mgcol]
            [monger.result :as mgres]
            [monger.conversion :as mgcnv]
            [monger.test.helper :as helper])
  (:use [clojure.test]
        [monger.operators]
        [monger.test.fixtures]))

(helper/connect!)

(use-fixtures :each purge-people purge-docs purge-things purge-libraries)


;;
;; find-one
;;

(deftest find-one-full-document-when-collection-is-empty
  (let [collection "docs"]
    (is (nil? (mgcol/find-one collection {})))))

(deftest find-one-full-document-as-map-when-collection-is-empty
  (let [collection "docs"]
    (is (nil? (mgcol/find-one-as-map collection {})))))


(deftest find-one-full-document-when-collection-has-matches
  (let [collection "docs"
        doc-id    (monger.util/random-uuid)
        doc       { :data-store "MongoDB", :language "Clojure", :_id doc-id }
        _         (mgcol/insert collection doc)
        found-one (mgcol/find-one collection { :language "Clojure" })]
    (is (= (:_id doc) (monger.util/get-id found-one)))
    (is (= (mgcnv/from-db-object found-one true) doc))
    (is (= (mgcnv/to-db-object doc) found-one))))

(deftest find-one-full-document-as-map-when-collection-has-matches
  (let [collection "docs"
        doc-id (monger.util/random-uuid)
        doc    { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
    (mgcol/insert collection doc)
    (is (= doc (mgcol/find-one-as-map collection { :language "Clojure" })))))


(deftest find-one-partial-document-when-collection-has-matches
  (let [collection "docs"
        doc-id (monger.util/random-uuid)
        doc    { :data-store "MongoDB", :language "Clojure", :_id doc-id }
        fields [:language]
        _      (mgcol/insert collection doc)
        loaded (mgcol/find-one collection { :language "Clojure" } fields)]
    (is (nil? (.get ^DBObject loaded "data-store")))
    (is (= doc-id (monger.util/get-id loaded)))
    (is (= "Clojure" (.get ^DBObject loaded "language")))))


(deftest find-one-partial-document-as-map-when-collection-has-matches
  (let [collection "docs"
        doc-id (monger.util/random-uuid)
        doc    { :data-store "MongoDB", :language "Clojure", :_id doc-id }
        fields [:data-store]]
    (mgcol/insert collection doc)
    (is (= { :data-store "MongoDB", :_id doc-id } (mgcol/find-one-as-map collection { :language "Clojure" } fields)))))


(deftest find-one-partial-document-as-map-when-collection-has-matches-with-keywordize
  (let [collection "docs"
        doc-id (monger.util/random-uuid)
        doc    { :data-store "MongoDB", :language "Clojure", :_id doc-id }
        fields [:data-store]
        _id    (mgcol/insert collection doc)
        loaded (mgcol/find-one-as-map collection { :language "Clojure" } fields true)]
    (is (= { :data-store "MongoDB", :_id doc-id } loaded))))


(deftest find-one-partial-document-as-map-when-collection-has-matches-with-keywordize-false
  (let [collection "docs"
        doc-id (monger.util/random-uuid)
        doc    { :data-store "MongoDB", :language "Clojure", :_id doc-id }
        fields [:data-store]
        _id    (mgcol/insert collection doc)
        loaded (mgcol/find-one-as-map collection { :language "Clojure" } fields false)]
    (is (= { "_id" doc-id, "data-store" "MongoDB" } loaded))))

;;
;; find-by-id
;;

(deftest find-full-document-by-string-id-when-that-document-does-not-exist
  (let [collection "libraries"
        doc-id (monger.util/random-uuid)]
    (is (nil? (mgcol/find-by-id collection doc-id)))))

(deftest find-full-document-by-string-id-when-id-is-nil
  (let [collection "libraries"
        doc-id nil]
    (is (thrown? IllegalArgumentException (mgcol/find-by-id collection doc-id)))))

(deftest find-full-document-by-object-id-when-that-document-does-not-exist
  (let [collection "libraries"
        doc-id (ObjectId.)]
    (is (nil? (mgcol/find-by-id collection doc-id)))))

(deftest find-full-document-by-id-as-map-when-that-document-does-not-exist
  (let [collection "libraries"
        doc-id (monger.util/random-uuid)]
    (is (nil? (mgcol/find-map-by-id collection doc-id)))))

(deftest find-full-document-by-id-as-map-when-id-is-nil
  (let [collection "libraries"
        doc-id nil]
    (is (thrown? IllegalArgumentException
                 (mgcol/find-map-by-id collection doc-id)))))


(deftest find-full-document-by-string-id-when-document-does-exist
  (let [collection "libraries"
        doc-id (monger.util/random-uuid)
        doc    { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
    (mgcol/insert collection doc)
    (is (= (mgcnv/to-db-object doc) (mgcol/find-by-id collection doc-id)))))

(deftest find-full-document-by-object-id-when-document-does-exist
  (let [collection "libraries"
        doc-id (ObjectId.)
        doc    { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
    (mgcol/insert collection doc)
    (is (= (mgcnv/to-db-object doc) (mgcol/find-by-id collection doc-id)))))

(deftest find-full-document-map-by-string-id-when-document-does-exist
  (let [collection "libraries"
        doc-id (monger.util/random-uuid)
        doc    { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
    (mgcol/insert collection doc)
    (is (= doc (mgcol/find-map-by-id collection doc-id)))))

(deftest find-full-document-map-by-object-id-when-document-does-exist
  (let [collection "libraries"
        doc-id (ObjectId.)
        doc    { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
    (mgcol/insert collection doc)
    (is (= doc (mgcol/find-map-by-id collection doc-id)))))

(deftest find-partial-document-by-id-when-document-does-exist
  (let [collection "libraries"
        doc-id (monger.util/random-uuid)
        doc    { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
    (mgcol/insert collection doc)
    (is (= (mgcnv/to-db-object { :_id doc-id :language "Clojure" })
           (mgcol/find-by-id collection doc-id [ :language ])))))


(deftest find-partial-document-as-map-by-id-when-document-does-exist
  (let [collection "libraries"
        doc-id (monger.util/random-uuid)
        fields [:data-store]
        doc    { :data-store "MongoDB", :language "Clojure", :_id doc-id }
        _      (mgcol/insert collection doc)
        loaded (mgcol/find-map-by-id collection doc-id [ :language ])]
    (is (= { :language "Clojure", :_id doc-id } loaded))))


;;
;; find
;;

(deftest find-full-document-when-collection-is-empty
  (let [collection "docs"
        cursor (mgcol/find collection)]
    (is (empty? (iterator-seq cursor)))))

(deftest find-document-seq-when-collection-is-empty
  (let [collection "docs"]
    (is (empty? (mgcol/find-seq collection)))))

(deftest find-multiple-documents-when-collection-is-empty
  (let [collection "libraries"]
    (is (empty? (mgcol/find collection { :language "Scala" })))))

(deftest find-multiple-maps-when-collection-is-empty
  (let [collection "libraries"]
    (is (empty? (mgcol/find-maps collection { :language "Scala" })))))

(deftest find-multiple-documents-by-regex
  (let [collection "libraries"]
    (mgcol/insert-batch collection [{ :language "Clojure",    :name "monger" }
                                    { :language "Java",       :name "nhibernate" }
                                    { :language "JavaScript", :name "sprout-core" }])
    (is (= 2 (monger.core/count (mgcol/find collection { :language #"Java*" }))))))

(deftest find-multiple-documents
  (let [collection "libraries"]
    (mgcol/insert-batch collection [{ :language "Clojure", :name "monger" }
                                    { :language "Clojure", :name "langohr" }
                                    { :language "Clojure", :name "incanter" }
                                    { :language "Scala",   :name "akka" }])
    (is (= 1 (monger.core/count (mgcol/find collection { :language "Scala" }))))
    (is (= 3 (.count (mgcol/find collection { :language "Clojure" }))))
    (is (empty? (mgcol/find collection { :language "Java" })))))


(deftest find-document-specify-fields
  (let [collection "libraries"
        _      (mgcol/insert collection { :language "Clojure", :name "monger" })
        result (mgcol/find collection { :language "Clojure" } [:language])]
    (is (= (seq [:_id :language]) (keys (mgcnv/from-db-object (.next result) true))))))

(deftest find-and-iterate-over-multiple-documents-the-hard-way
  (let [collection "libraries"]
    (mgcol/insert-batch collection [{ :language "Clojure", :name "monger" }
                                    { :language "Clojure", :name "langohr" }
                                    { :language "Clojure", :name "incanter" }
                                    { :language "Scala",   :name "akka" }])
    (doseq [doc (take 3 (map (fn [dbo]
                               (mgcnv/from-db-object dbo true))
                             (mgcol/find-seq collection { :language "Clojure" })))]
      (is (= "Clojure" (:language doc))))))

(deftest find-and-iterate-over-multiple-documents
  (let [collection "libraries"]
    (mgcol/insert-batch collection [{ :language "Clojure", :name "monger" }
                                    { :language "Clojure", :name "langohr" }
                                    { :language "Clojure", :name "incanter" }
                                    { :language "Scala",   :name "akka" }])
    (doseq [doc (take 3 (mgcol/find-maps collection { :language "Clojure" }))]
      (is (= "Clojure" (:language doc))))))


(deftest find-multiple-maps
  (let [collection "libraries"]
    (mgcol/insert-batch collection [{ :language "Clojure", :name "monger" }
                                    { :language "Clojure", :name "langohr" }
                                    { :language "Clojure", :name "incanter" }
                                    { :language "Scala",   :name "akka" }])
    (is (= 1 (clojure.core/count (mgcol/find-maps collection { :language "Scala" }))))
    (is (= 3 (.count (mgcol/find-maps collection { :language "Clojure" }))))
    (is (empty? (mgcol/find-maps collection { :language "Java" })))
    (is (empty? (mgcol/find-maps monger.core/*mongodb-database* collection { :language "Java" } [:language :name])))))


(deftest find-multiple-partial-documents
  (let [collection "libraries"]
    (mgcol/insert-batch collection [{ :language "Clojure", :name "monger" }
                                    { :language "Clojure", :name "langohr" }
                                    { :language "Clojure", :name "incanter" }
                                    { :language "Scala",   :name "akka" }])
    (let [scala-libs   (mgcol/find collection { :language "Scala" } [:name])
          clojure-libs (mgcol/find collection { :language "Clojure" } [:language])]
      (is (= 1 (.count scala-libs)))
      (is (= 3 (.count clojure-libs)))
      (doseq [i clojure-libs]
        (let [doc (mgcnv/from-db-object i true)]
          (is (= (:language doc) "Clojure"))))
      (is (empty? (mgcol/find collection { :language "Erlang" } [:name]))))))

@@ -1,293 +0,0 @@
(ns monger.test.regular-finders-test
  (:import [com.mongodb WriteResult WriteConcern DBCursor DBObject]
           org.bson.types.ObjectId
           java.util.Date)
  (:require [monger.core :as mg]
            [monger.collection :as mc]
            [monger.util :as mu]
            [monger.result :as mgres]
            [monger.conversion :as mgcnv]
            [clojure.test :refer :all]
            [monger.operators :refer :all]
            [monger.conversion :refer [to-db-object]]))

(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")]
  (use-fixtures :each (fn [f]
                        (mc/remove db "people")
                        (mc/remove db "docs")
                        (mc/remove db "regular_finders_docs")
                        (mc/remove db "things")
                        (mc/remove db "libraries")
                        (f)
                        (mc/remove db "people")
                        (mc/remove db "docs")
                        (mc/remove db "regular_finders_docs")
                        (mc/remove db "things")
                        (mc/remove db "libraries")))

  ;;
  ;; find-one
  ;;

  (deftest find-one-full-document-when-collection-is-empty
    (let [collection "regular_finders_docs"]
      (is (nil? (mc/find-one db collection {})))))

  (deftest find-one-full-document-as-map-when-collection-is-empty
    (let [collection "regular_finders_docs"]
      (mc/remove db collection)
      (is (nil? (mc/find-one-as-map db collection {})))))


  (deftest find-one-full-document-when-collection-has-matches
    (let [collection "regular_finders_docs"
          doc-id    (mu/random-uuid)
          doc       { :data-store "MongoDB", :language "Clojure", :_id doc-id }
          _         (mc/insert db collection doc)
          found-one (mc/find-one db collection { :language "Clojure" })]
      (is found-one)
      (is (= (:_id doc) (mu/get-id found-one)))
      (is (= (mgcnv/from-db-object found-one true) doc))
      (is (= (mgcnv/to-db-object doc) found-one))))

  (deftest find-one-full-document-as-map-when-collection-has-matches
    (let [collection "regular_finders_docs"
          doc-id (mu/random-uuid)
          doc    { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
      (mc/insert db collection doc)
      (is (= doc (mc/find-one-as-map db collection { :language "Clojure" })))))


  (deftest find-one-partial-document-when-collection-has-matches
    (let [collection "regular_finders_docs"
          doc-id (mu/random-uuid)
          doc    { :data-store "MongoDB", :language "Clojure", :_id doc-id }
          _      (mc/insert db collection doc)
          loaded (mc/find-one db collection { :language "Clojure" } [:language])]
      (is (nil? (.get ^DBObject loaded "data-store")))
      (is (= doc-id (mu/get-id loaded)))
      (is (= "Clojure" (.get ^DBObject loaded "language")))))


  (deftest find-one-partial-document-using-field-negation-when-collection-has-matches
    (let [collection "regular_finders_docs"
          doc-id (mu/random-uuid)
          doc    { :data-store "MongoDB", :language "Clojure", :_id doc-id }
          _      (mc/insert db collection doc)
          ^DBObject loaded (mc/find-one db collection { :language "Clojure" } {:data-store 0 :_id 0})]
      (is (nil? (.get loaded "data-store")))
      (is (nil? (.get loaded "_id")))
      (is (nil? (mu/get-id loaded)))
      (is (= "Clojure" (.get loaded "language")))))


  (deftest find-one-partial-document-as-map-when-collection-has-matches
    (let [collection "regular_finders_docs"
          doc-id (mu/random-uuid)
          doc    { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
      (mc/insert db collection doc)
      (is (= { :data-store "MongoDB", :_id doc-id }
             (mc/find-one-as-map db collection { :language "Clojure" } [:data-store])))))


  (deftest find-one-partial-document-as-map-when-collection-has-matches-with-keywordize
    (let [collection "regular_finders_docs"
          doc-id (mu/random-uuid)
          doc    { :data-store "MongoDB", :language "Clojure", :_id doc-id }
          fields [:data-store]
          _id    (mc/insert db collection doc)
          loaded (mc/find-one-as-map db collection { :language "Clojure" } fields true)]
      (is (= { :data-store "MongoDB", :_id doc-id } loaded))))


  (deftest find-one-partial-document-as-map-when-collection-has-matches-with-keywordize-false
    (let [collection "regular_finders_docs"
          doc-id (mu/random-uuid)
          doc    { :data-store "MongoDB", :language "Clojure", :_id doc-id }
          fields [:data-store]
          _id    (mc/insert db collection doc)
          loaded (mc/find-one-as-map db collection { :language "Clojure" } fields false)]
      (is (= { "_id" doc-id, "data-store" "MongoDB" } loaded))))

  ;;
  ;; find-by-id
  ;;

  (deftest find-full-document-by-string-id-when-that-document-does-not-exist
    (let [collection "libraries"
          doc-id (mu/random-uuid)]
      (is (nil? (mc/find-by-id db collection doc-id)))))

  (deftest find-full-document-by-string-id-when-id-is-nil
    (let [collection "libraries"
          doc-id nil]
      (is (thrown? IllegalArgumentException (mc/find-by-id db collection doc-id)))))

  (deftest find-full-document-by-object-id-when-that-document-does-not-exist
    (let [collection "libraries"
          doc-id (ObjectId.)]
      (is (nil? (mc/find-by-id db collection doc-id)))))

  (deftest find-full-document-by-id-as-map-when-that-document-does-not-exist
    (let [collection "libraries"
          doc-id (mu/random-uuid)]
      (is (nil? (mc/find-map-by-id db collection doc-id)))))

  (deftest find-full-document-by-id-as-map-when-id-is-nil
    (let [collection "libraries"
          doc-id nil]
      (is (thrown? IllegalArgumentException
                   (mc/find-map-by-id db collection doc-id)))))


  (deftest find-full-document-by-string-id-when-document-does-exist
    (let [collection "libraries"
          doc-id (mu/random-uuid)
          doc    { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
      (mc/insert db collection doc)
      (is (= (to-db-object doc) (mc/find-by-id db collection doc-id)))))

  (deftest find-full-document-by-object-id-when-document-does-exist
    (let [collection "libraries"
          doc-id (ObjectId.)
          doc    { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
      (mc/insert db collection doc)
      (is (= (to-db-object doc) (mc/find-by-id db collection doc-id)))))

  (deftest find-full-document-map-by-string-id-when-document-does-exist
    (let [collection "libraries"
          doc-id (mu/random-uuid)
          doc    { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
      (mc/insert db collection doc)
      (is (= doc (mc/find-map-by-id db collection doc-id)))))

  (deftest find-full-document-map-by-object-id-when-document-does-exist
    (let [collection "libraries"
          doc-id (ObjectId.)
          doc    { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
      (mc/insert db collection doc)
      (is (= doc (mc/find-map-by-id db collection doc-id)))))

  (deftest find-partial-document-by-id-when-document-does-exist
    (let [collection "libraries"
          doc-id (mu/random-uuid)
          doc    { :data-store "MongoDB", :language "Clojure", :_id doc-id }]
      (mc/insert db collection doc)
      (is (= (to-db-object { :_id doc-id :language "Clojure" })
             (mc/find-by-id db collection doc-id [ :language ])))))


  (deftest find-partial-document-as-map-by-id-when-document-does-exist
    (let [collection "libraries"
          doc-id (mu/random-uuid)
          fields [:data-store]
          doc    { :data-store "MongoDB", :language "Clojure", :_id doc-id }
          _      (mc/insert db collection doc)
          loaded (mc/find-map-by-id db collection doc-id [ :language ])]
      (is (= { :language "Clojure", :_id doc-id } loaded))))


  ;;
  ;; find
  ;;

  (deftest find-full-document-when-collection-is-empty
    (let [collection "regular_finders_docs"
          cursor (mc/find db collection)]
      (is (empty? (iterator-seq cursor)))))

  (deftest find-document-seq-when-collection-is-empty
    (let [collection "regular_finders_docs"]
      (is (empty? (mc/find-seq db collection)))))

  (deftest find-multiple-documents-when-collection-is-empty
    (let [collection "libraries"]
      (is (empty? (mc/find db collection { :language "Scala" })))))

  (deftest find-multiple-maps-when-collection-is-empty
    (let [collection "libraries"]
      (is (empty? (mc/find-maps db collection { :language "Scala" })))))

  (deftest find-multiple-documents-by-regex
    (let [collection "libraries"]
      (mc/insert-batch db collection [{ :language "Clojure",    :name "monger" }
                                      { :language "Java",       :name "nhibernate" }
                                      { :language "JavaScript", :name "sprout-core" }])
      (is (= 2 (monger.core/count (mc/find db collection { :language #"Java*" }))))))

  (deftest find-multiple-documents
    (let [collection "libraries"]
      (mc/insert-batch db collection [{ :language "Clojure", :name "monger" }
                                      { :language "Clojure", :name "langohr" }
                                      { :language "Clojure", :name "incanter" }
                                      { :language "Scala",   :name "akka" }])
      (is (= 1 (monger.core/count (mc/find db collection { :language "Scala" }))))
      (is (= 3 (.count (mc/find db collection { :language "Clojure" }))))
      (is (empty? (mc/find db collection { :language "Java" })))))


  (deftest find-document-specify-fields
    (let [collection "libraries"
          _      (mc/insert db collection { :language "Clojure", :name "monger" })
          result (mc/find db collection { :language "Clojure" } [:language])]
      (is (= (set [:_id :language]) (-> (mgcnv/from-db-object (.next result) true) keys set)))))

  (deftest find-and-iterate-over-multiple-documents-the-hard-way
    (let [collection "libraries"]
      (mc/insert-batch db collection [{ :language "Clojure", :name "monger" }
                                      { :language "Clojure", :name "langohr" }
                                      { :language "Clojure", :name "incanter" }
                                      { :language "Scala",   :name "akka" }])
      (doseq [doc (take 3 (map (fn [dbo]
                                 (mgcnv/from-db-object dbo true))
                               (mc/find-seq db collection { :language "Clojure" })))]
        (is (= "Clojure" (:language doc))))))

  (deftest find-and-iterate-over-multiple-documents
    (let [collection "libraries"]
      (mc/insert-batch db collection [{ :language "Clojure", :name "monger" }
                                      { :language "Clojure", :name "langohr" }
                                      { :language "Clojure", :name "incanter" }
                                      { :language "Scala",   :name "akka" }])
      (doseq [doc (take 3 (mc/find-maps db collection { :language "Clojure" }))]
        (is (= "Clojure" (:language doc))))))


  (deftest find-multiple-maps
    (let [collection "libraries"]
      (mc/insert-batch db collection [{ :language "Clojure", :name "monger" }
                                      { :language "Clojure", :name "langohr" }
                                      { :language "Clojure", :name "incanter" }
                                      { :language "Scala",   :name "akka" }])
      (is (= 1 (clojure.core/count (mc/find-maps db collection { :language "Scala" }))))
      (is (= 3 (.count (mc/find-maps db collection { :language "Clojure" }))))
      (is (empty? (mc/find-maps db collection { :language "Java" })))
      (is (empty? (mc/find-maps db collection { :language "Java" } [:language :name])))))


  (deftest find-multiple-partial-documents
    (let [collection "libraries"]
      (mc/insert-batch db collection [{ :language "Clojure", :name "monger" }
                                      { :language "Clojure", :name "langohr" }
                                      { :language "Clojure", :name "incanter" }
                                      { :language "Scala",   :name "akka" }])
      (let [scala-libs   (mc/find db collection { :language "Scala" } [:name])
            clojure-libs (mc/find db collection { :language "Clojure" } [:language])]
        (is (= 1 (.count scala-libs)))
        (is (= 3 (.count clojure-libs)))
        (doseq [i clojure-libs]
          (let [doc (mgcnv/from-db-object i true)]
            (is (= (:language doc) "Clojure"))))
        (is (empty? (mc/find db collection { :language "Erlang" } [:name]))))))

  (deftest find-maps-with-keywordize-false
    (let [collection "libraries"]
      (mc/insert-batch db collection [{ :language "Clojure", :name "monger" }
                                      { :language "Clojure", :name "langohr" }])
      (let [results (mc/find-maps db collection {:name "langohr"} [] false)]
        (is (= 1 (.count results)))
        (is (= (get (first results) "language") "Clojure"))))))

54 test/monger/test/result.clj Normal file
@@ -0,0 +1,54 @@
(ns monger.test.result
  (:import (com.mongodb BasicDBObject WriteResult WriteConcern)
           (java.util Date))
  (:require [monger core collection conversion result util]
            [monger.test.helper :as helper])
  (:use [clojure.test]))

(helper/connect!)

;;
;; MongoCommandResult
;;

(deftest test-ok?
  (let [result-that-is-not-ok-1 (doto (BasicDBObject.) (.put "ok" 0))
        result-that-is-not-ok-2 (doto (BasicDBObject.) (.put "ok" "false"))
        result-that-is-ok-1     (doto (BasicDBObject.) (.put "ok" 1))
        result-that-is-ok-2     (doto (BasicDBObject.) (.put "ok" "true"))
        result-that-is-ok-3     (doto (BasicDBObject.) (.put "ok" 1.0))]
    (is (not (monger.result/ok? result-that-is-not-ok-1)))
    (is (not (monger.result/ok? result-that-is-not-ok-2)))
    (is (monger.result/ok? result-that-is-ok-1))
    (is (monger.result/ok? result-that-is-ok-2))
    (is (monger.result/ok? result-that-is-ok-3))))


(deftest test-has-error?
  (let [result-that-has-no-error1 (doto (BasicDBObject.) (.put "ok" 0))
        result-that-has-no-error2 (doto (BasicDBObject.) (.put "err" ""))
        result-that-has-error1    (doto (BasicDBObject.) (.put "err" (BasicDBObject.)))]
    (is (not (monger.result/has-error? result-that-has-no-error1)))
    (is (not (monger.result/has-error? result-that-has-no-error2)))
    (is (monger.result/has-error? result-that-has-error1))))


(deftest test-updated-existing?-with-db-object
  (let [input1 (doto (BasicDBObject.) (.put "updatedExisting" true))
        input2 (doto (BasicDBObject.) (.put "updatedExisting" false))
        input3 (BasicDBObject.)]
    (is (monger.result/updated-existing? input1))
    (is (not (monger.result/updated-existing? input2)))
    (is (not (monger.result/updated-existing? input3)))))

(deftest test-updated-existing?-with-write-result
  (monger.collection/remove "libraries")
  (let [collection "libraries"
        doc-id (monger.util/random-uuid)
        date   (Date.)
        doc          { :created-at date, :data-store "MongoDB", :language "Clojure", :_id doc-id }
        modified-doc { :created-at date, :data-store "MongoDB", :language "Erlang", :_id doc-id }]
    (is (not (monger.result/updated-existing? (monger.collection/update collection { :language "Clojure" } doc :upsert true))))
    (is (monger.result/updated-existing? (monger.collection/update collection { :language "Clojure" } doc :upsert true)))
    (is (monger.result/updated-existing? (monger.collection/update collection { :language "Clojure" } modified-doc :multi false :upsert true)))
    (monger.collection/remove collection)))

@@ -1,25 +0,0 @@
(ns monger.test.result-test
  (:import [com.mongodb BasicDBObject WriteResult WriteConcern]
           java.util.Date)
  (:require [monger.core :as mg]
            [monger.collection :as mc]
            [monger.result :as mgres]
            monger.util
            [clojure.test :refer :all]))

(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")]
  (deftest test-updated-existing?-with-write-result
    (mc/remove db "libraries")
    (let [collection "libraries"
          doc-id (monger.util/random-uuid)
          date   (Date.)
          doc          { :created-at date :data-store "MongoDB" :language "Clojure" :_id doc-id }
          modified-doc { :created-at date :data-store "MongoDB" :language "Erlang" :_id doc-id }]
      (let [result (mc/update db collection { :language "Clojure" } doc {:upsert true})]
        (is (not (mgres/updated-existing? result)))
        (is (= 1 (mgres/affected-count result))))
      (is (mgres/updated-existing? (mc/update db collection { :language "Clojure" } doc {:upsert true})))
      (is (mgres/updated-existing? (mc/update db collection { :language "Clojure" } modified-doc {:multi false :upsert true})))
      (is (= 1 (mgres/affected-count (mc/remove db collection { :_id doc-id }))))
      (mc/remove db collection)
      (mg/disconnect conn))))
|
||||
|
|
@@ -1,50 +0,0 @@
(ns monger.test.ring.clojure-session-store-test
  (:require [monger.core :as mg]
            [monger.collection :as mc]
            [clojure.test :refer :all]
            [ring.middleware.session.store :refer :all]
            [monger.ring.session-store :refer :all]))


(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")]
  (defn purge-sessions
    [f]
    (mc/remove db "sessions")
    (f)
    (mc/remove db "sessions"))

  (use-fixtures :each purge-sessions)


  (deftest test-reading-a-session-that-does-not-exist
    (let [store (session-store db "sessions")]
      (is (= {} (read-session store "a-missing-key-1228277")))))


  (deftest test-reading-a-session-that-does-exist
    (let [store (session-store db "sessions")
          sk    (write-session store nil {:library "Monger"})
          m     (read-session store sk)]
      (is sk)
      (is (and (:_id m)))
      (is (= (dissoc m :_id)
             {:library "Monger"}))))


  (deftest test-updating-a-session
    (let [store (session-store db "sessions")
          sk1   (write-session store nil {:library "Monger"})
          sk2   (write-session store sk1 {:library "Ring"})
          m     (read-session store sk2)]
      (is (and sk1 sk2))
      (is (and (:_id m)))
      (is (= sk1 sk2))
      (is (= (dissoc m :_id)
             {:library "Ring"}))))

  (deftest test-deleting-a-session
    (let [store (session-store db "sessions")
          sk    (write-session store nil {:library "Monger"})]
      (is (nil? (delete-session store sk)))
      (is (= {} (read-session store sk))))))
@@ -1,54 +0,0 @@
(ns monger.test.ring.session-store-test
  (:require [monger.core :as mg]
            [monger.collection :as mc]
            [clojure.test :refer :all]
            [ring.middleware.session.store :refer :all]
            [monger.ring.session-store :refer :all]))


(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")]
  (defn purge-sessions
    [f]
    (mc/remove db "sessions")
    (f)
    (mc/remove db "sessions"))

  (use-fixtures :each purge-sessions)

  (deftest test-reading-a-session-that-does-not-exist
    (let [store (monger-store db "sessions")]
      (is (= {} (read-session store "a-missing-key-1228277")))))

  (deftest test-reading-a-session-that-does-exist
    (let [store (monger-store db "sessions")
          sk    (write-session store nil {:library "Monger"})
          m     (read-session store sk)]
      (is sk)
      (is (and (:_id m) (:date m)))
      (is (= (dissoc m :_id :date)
             {:library "Monger"}))))

  (deftest test-updating-a-session
    (let [store (monger-store db "sessions")
          sk1   (write-session store nil {:library "Monger"})
          sk2   (write-session store sk1 {:library "Ring"})
          m     (read-session store sk2)]
      (is (and sk1 sk2))
      (is (and (:_id m) (:date m)))
      (is (= sk1 sk2))
      (is (= (dissoc m :_id :date)
             {:library "Ring"}))))

  (deftest test-deleting-a-session
    (let [store (monger-store db "sessions")
          sk    (write-session store nil {:library "Monger"})]
      (is (nil? (delete-session store sk)))
      (is (= {} (read-session store sk)))))

  (deftest test-reader-extensions
    (let [d   (java.util.Date.)
          oid (org.bson.types.ObjectId.)]
      (binding [*print-dup* true]
        (pr-str d)
        (pr-str oid)))))
42  test/monger/test/stress.clj  Normal file
@@ -0,0 +1,42 @@
(ns monger.test.stress
  (:import [com.mongodb Mongo DB DBCollection WriteResult DBObject WriteConcern DBCursor]
           [java.util Date])
  (:require [monger core]
            [monger.test.helper :as helper])
  (:use [clojure.test]))


;;
;; Fixture functions
;;

(defn purge-collection
  [collection-name, f]
  (monger.collection/remove collection-name)
  (f)
  (monger.collection/remove collection-name))

(defn purge-things-collection
  [f]
  (purge-collection "things" f))

(use-fixtures :each purge-things-collection)


;;
;; Tests
;;

(monger.core/set-default-write-concern! WriteConcern/NORMAL)

(deftest insert-large-batches-of-documents-without-object-ids
  (doseq [n [1000 10000 100000]]
    (let [collection "things"
          docs (map (fn [i]
                      (monger.conversion/to-db-object { :title "Untitled" :created-at (Date.) :number i }))
                    (take n (iterate inc 1)))]
      (monger.collection/remove collection)
      (println "Inserting " n " documents...")
      (time (monger.collection/insert-batch collection docs))
      (is (= n (monger.collection/count collection))))))
@@ -1,40 +0,0 @@
(ns monger.test.stress-test
  (:require [monger.core :as mg]
            [monger.collection :as mc]
            [monger.conversion :refer [to-db-object from-db-object]]
            [clojure.test :refer :all])
  (:import [com.mongodb WriteConcern]
           java.util.Date))


(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")]
  (defn purge-collection
    [coll f]
    (mc/remove db coll)
    (f)
    (mc/remove db coll))

  (defn purge-things-collection
    [f]
    (purge-collection "things" f))

  (use-fixtures :each purge-things-collection)

  (deftest ^{:performance true} insert-large-batches-of-documents-without-object-ids
    (doseq [n [10 100 1000 10000 20000]]
      (let [collection "things"
            docs (map (fn [i]
                        (to-db-object { :title "Untitled" :created-at (Date.) :number i }))
                      (take n (iterate inc 1)))]
        (mc/remove db collection)
        (println "Inserting " n " documents...")
        (time (mc/insert-batch db collection docs))
        (is (= n (mc/count db collection))))))

  (deftest ^{:performance true} convert-large-number-of-dbojects-to-maps
    (doseq [n [10 100 1000 20000 40000]]
      (let [docs (map (fn [i]
                        (to-db-object {:title "Untitled" :created-at (Date.) :number i}))
                      (take n (iterate inc 1)))]
        (time (doall (map (fn [x] (from-db-object x true)) docs)))))))
112  test/monger/test/updating.clj  Normal file
@@ -0,0 +1,112 @@
(set! *warn-on-reflection* true)

(ns monger.test.updating
  (:import [com.mongodb WriteResult WriteConcern DBCursor DBObject]
           [org.bson.types ObjectId]
           [java.util Date])
  (:require [monger core util]
            [monger.collection :as mgcol]
            [monger.result :as mgres]
            [monger.conversion :as mgcnv]
            [monger.test.helper :as helper])
  (:use [clojure.test]
        [monger.operators]
        [monger.test.fixtures]))

(helper/connect!)

(use-fixtures :each purge-people purge-docs purge-things purge-libraries)


;;
;; update, save
;;

(deftest update-document-by-id-without-upsert
  (let [collection "libraries"
        doc-id (monger.util/random-uuid)
        date (Date.)
        doc { :created-at date, :data-store "MongoDB", :language "Clojure", :_id doc-id }
        modified-doc { :created-at date, :data-store "MongoDB", :language "Erlang", :_id doc-id }]
    (mgcol/insert collection doc)
    (is (= (doc (mgcol/find-by-id collection doc-id))))
    (mgcol/update collection { :_id doc-id } { :language "Erlang" })
    (is (= (modified-doc (mgcol/find-by-id collection doc-id))))))

(deftest update-document-by-id-without-upsert-using-update-by-id
  (let [collection "libraries"
        doc-id (monger.util/random-uuid)
        date (Date.)
        doc { :created-at date, :data-store "MongoDB", :language "Clojure", :_id doc-id }
        modified-doc { :created-at date, :data-store "MongoDB", :language "Erlang", :_id doc-id }]
    (mgcol/insert collection doc)
    (is (= (doc (mgcol/find-by-id collection doc-id))))
    (mgcol/update-by-id collection doc-id { :language "Erlang" })
    (is (= (modified-doc (mgcol/find-by-id collection doc-id))))))


(deftest update-multiple-documents
  (let [collection "libraries"]
    (mgcol/insert collection { :language "Clojure", :name "monger" })
    (mgcol/insert collection { :language "Clojure", :name "langohr" })
    (mgcol/insert collection { :language "Clojure", :name "incanter" })
    (mgcol/insert collection { :language "Scala", :name "akka" })
    (is (= 3 (mgcol/count collection { :language "Clojure" })))
    (is (= 1 (mgcol/count collection { :language "Scala" })))
    (is (= 0 (mgcol/count collection { :language "Python" })))
    (mgcol/update collection { :language "Clojure" } { $set { :language "Python" } } :multi true)
    (is (= 0 (mgcol/count collection { :language "Clojure" })))
    (is (= 1 (mgcol/count collection { :language "Scala" })))
    (is (= 3 (mgcol/count collection { :language "Python" })))))


(deftest save-a-new-document
  (let [collection "people"
        document { :name "Joe", :age 30 }]
    (is (monger.result/ok? (mgcol/save "people" document)))
    (is (= 1 (mgcol/count collection)))))


(deftest save-a-new-basic-db-object
  (let [collection "people"
        doc (mgcnv/to-db-object { :name "Joe", :age 30 })]
    (is (nil? (monger.util/get-id doc)))
    (mgcol/save monger.core/*mongodb-database* "people" doc WriteConcern/SAFE)
    (is (not (nil? (monger.util/get-id doc))))))


(deftest update-an-existing-document-using-save
  (let [collection "people"
        doc-id "people-1"
        document { :_id doc-id, :name "Joe", :age 30 }]
    (is (monger.result/ok? (mgcol/insert "people" document)))
    (is (= 1 (mgcol/count collection)))
    (mgcol/save collection { :_id doc-id, :name "Alan", :age 40 })
    (is (= 1 (mgcol/count collection { :name "Alan", :age 40 })))))


(deftest set-an-attribute-on-existing-document-using-update
  (let [collection "people"
        doc-id (monger.util/object-id)
        document { :_id doc-id, :name "Joe", :age 30 }]
    (is (monger.result/ok? (mgcol/insert "people" document)))
    (is (= 1 (mgcol/count collection)))
    (is (= 0 (mgcol/count collection { :has_kids true })))
    (mgcol/update collection { :_id doc-id } { $set { :has_kids true } })
    (is (= 1 (mgcol/count collection { :has_kids true })))))


(deftest upsert-a-document
  (let [collection "libraries"
        doc-id (monger.util/random-uuid)
        date (Date.)
        doc { :created-at date, :data-store "MongoDB", :language "Clojure", :_id doc-id }
        modified-doc { :created-at date, :data-store "MongoDB", :language "Erlang", :_id doc-id }]
    (is (not (monger.result/updated-existing? (mgcol/update collection { :language "Clojure" } doc :upsert true))))
    (is (= 1 (mgcol/count collection)))
    (is (monger.result/updated-existing? (mgcol/update collection { :language "Clojure" } modified-doc :multi false :upsert true)))
    (is (= 1 (mgcol/count collection)))
    (is (= (modified-doc (mgcol/find-by-id collection doc-id))))
    (mgcol/remove collection)))
@@ -1,169 +0,0 @@
(ns monger.test.updating-test
  (:import [com.mongodb WriteResult WriteConcern DBObject]
           org.bson.types.ObjectId
           java.util.Date)
  (:require [monger.core :as mg]
            [monger.collection :as mc]
            [monger.util :as mu]
            [monger.result :as mr]
            [clojure.test :refer :all]
            [monger.operators :refer :all]
            [monger.conversion :refer [to-db-object]]))

(let [conn (mg/connect)
      db   (mg/get-db conn "monger-test")]
  (defn purge-collections
    [f]
    (mc/remove db "people")
    (mc/remove db "docs")
    (mc/remove db "things")
    (mc/remove db "libraries")
    (f)
    (mc/remove db "people")
    (mc/remove db "docs")
    (mc/remove db "things")
    (mc/remove db "libraries"))

  (use-fixtures :each purge-collections)

  (deftest ^{:updating true} update-document-by-id-without-upsert
    (let [collection "libraries"
          doc-id (mu/random-uuid)
          date (Date.)
          doc { :created-at date, :data-store "MongoDB", :language "Clojure", :_id doc-id }
          modified-doc { :created-at date, :data-store "MongoDB", :language "Erlang", :_id doc-id }]
      (mc/insert db collection doc)
      (is (= (to-db-object doc) (mc/find-by-id db collection doc-id)))
      (mc/update db collection { :_id doc-id } { $set { :language "Erlang" } })
      (is (= (to-db-object modified-doc) (mc/find-by-id db collection doc-id)))))

  (deftest ^{:updating true} update-document-by-id-without-upsert-using-update-by-id
    (let [collection "libraries"
          doc-id (mu/random-uuid)
          date (Date.)
          doc { :created-at date, :data-store "MongoDB", :language "Clojure", :_id doc-id }
          modified-doc { :created-at date, :data-store "MongoDB", :language "Erlang", :_id doc-id }]
      (mc/insert db collection doc)
      (is (= (to-db-object doc) (mc/find-by-id db collection doc-id)))
      (mc/update-by-id db collection doc-id { $set { :language "Erlang" } })
      (is (= (to-db-object modified-doc) (mc/find-by-id db collection doc-id)))))

  (deftest ^{:updating true} update-nested-document-fields-without-upsert-using-update-by-id
    (let [collection "libraries"
          doc-id (ObjectId.)
          date (Date.)
          doc { :created-at date :data-store "MongoDB" :language { :primary "Clojure" } :_id doc-id }
          modified-doc { :created-at date :data-store "MongoDB" :language { :primary "Erlang" } :_id doc-id }]
      (mc/insert db collection doc)
      (is (= (to-db-object doc) (mc/find-by-id db collection doc-id)))
      (mc/update-by-id db collection doc-id { $set { "language.primary" "Erlang" }})
      (is (= (to-db-object modified-doc) (mc/find-by-id db collection doc-id)))))


  (deftest ^{:updating true} update-multiple-documents
    (let [collection "libraries"]
      (mc/insert-batch db collection [{ :language "Clojure", :name "monger" }
                                      { :language "Clojure", :name "langohr" }
                                      { :language "Clojure", :name "incanter" }
                                      { :language "Scala", :name "akka" }])
      (is (= 3 (mc/count db collection { :language "Clojure" })))
      (is (= 1 (mc/count db collection { :language "Scala" })))
      (is (= 0 (mc/count db collection { :language "Python" })))
      (mc/update db collection { :language "Clojure" } { $set { :language "Python" } } {:multi true})
      (is (= 0 (mc/count db collection { :language "Clojure" })))
      (is (= 1 (mc/count db collection { :language "Scala" })))
      (is (= 3 (mc/count db collection { :language "Python" })))))


  (deftest ^{:updating true} save-a-new-document
    (let [collection "people"
          document {:name "Joe" :age 30}]
      (is (mc/save db "people" document))
      (is (= 1 (mc/count db collection)))))

  (deftest ^{:updating true} save-and-return-a-new-document
    (let [collection "people"
          document {:name "Joe" :age 30}
          returned (mc/save-and-return db "people" document)]
      (is (:_id returned))
      (is (= document (dissoc returned :_id)))
      (is (= 1 (mc/count db collection)))))


  (deftest ^{:updating true} save-a-new-basic-db-object
    (let [collection "people"
          doc (to-db-object {:name "Joe" :age 30})]
      (is (nil? (mu/get-id doc)))
      (mc/save db "people" doc WriteConcern/SAFE)
      (is (not (nil? (mu/get-id doc))))))


  (deftest ^{:updating true} update-an-existing-document-using-save
    (let [collection "people"
          doc-id "people-1"
          document { :_id doc-id, :name "Joe", :age 30 }]
      (is (mc/insert db collection document))
      (is (= 1 (mc/count db collection)))
      (mc/save db collection { :_id doc-id, :name "Alan", :age 40 })
      (is (= 1 (mc/count db collection { :name "Alan", :age 40 })))))

  (deftest ^{:updating true} update-an-existing-document-using-save-and-return
    (let [collection "people"
          document (mc/insert-and-return db collection {:name "Joe" :age 30})
          doc-id   (:_id document)
          updated  (mc/save-and-return db collection {:_id doc-id :name "Alan" :age 40})]
      (is (= {:_id doc-id :name "Alan" :age 40} updated))
      (is (= 1 (mc/count db collection)))
      (is (= 1 (mc/count db collection {:name "Alan" :age 40})))))


  (deftest ^{:updating true} set-an-attribute-on-existing-document-using-update
    (let [collection "people"
          doc-id (mu/object-id)
          document { :_id doc-id, :name "Joe", :age 30 }]
      (is (mc/insert db collection document))
      (is (= 1 (mc/count db collection)))
      (is (= 0 (mc/count db collection { :has_kids true })))
      (mc/update db collection { :_id doc-id } { $set { :has_kids true } })
      (is (= 1 (mc/count db collection { :has_kids true })))))


  (deftest ^{:updating true} increment-multiple-fields-using-exists-operator-and-update
    (let [collection "matches"
          doc-id (mu/object-id)
          document { :_id doc-id :abc 0 :def 10 }]
      (mc/remove db collection)
      (is (mc/insert db collection document))
      (is (= 1 (mc/count db collection {:abc {$exists true} :def {$exists true}})))
      (mc/update db collection {:abc {$exists true} :def {$exists true}} {$inc {:abc 1 :def 0}})
      (is (= 1 (mc/count db collection { :abc 1 })))))


  (deftest ^{:updating true} upsert-a-document-using-update
    (let [collection "libraries"
          doc-id (mu/random-uuid)
          date (Date.)
          doc { :created-at date, :data-store "MongoDB", :language "Clojure", :_id doc-id }
          modified-doc { :created-at date, :data-store "MongoDB", :language "Erlang", :_id doc-id }]
      (is (not (mr/updated-existing? (mc/update db collection { :language "Clojure" } doc {:upsert true}))))
      (is (= 1 (mc/count db collection)))
      (is (mr/updated-existing? (mc/update db collection { :language "Clojure" } modified-doc {:multi false :upsert true})))
      (is (= 1 (mc/count db collection)))
      (is (= (to-db-object modified-doc) (mc/find-by-id db collection doc-id)))
      (mc/remove db collection)))

  (deftest ^{:updating true} upsert-a-document-using-upsert
    (let [collection "libraries"
          doc-id (mu/random-uuid)
          date (Date.)
          doc {:created-at date :data-store "MongoDB" :language "Clojure" :_id doc-id}
          modified-doc {:created-at date :data-store "MongoDB" :language "Erlang" :_id doc-id}]
      (mc/remove db collection)
      (is (not (mr/updated-existing? (mc/upsert db collection {:language "Clojure"} doc))))
      (is (= 1 (mc/count db collection)))
      (is (mr/updated-existing? (mc/upsert db collection {:language "Clojure"} modified-doc {:multi false})))
      (is (= 1 (mc/count db collection)))
      (is (= (to-db-object modified-doc) (mc/find-by-id db collection doc-id)))
      (mc/remove db collection))))
@@ -1,7 +1,7 @@
(ns monger.test.util-test
  (:import com.mongodb.DBObject)
  (:require [monger util conversion]
            [clojure.test :refer :all]))
(ns monger.test.util
  (:import (com.mongodb DBObject))
  (:require [monger util conversion])
  (:use [clojure.test]))


(deftest get-object-id
@@ -1,11 +0,0 @@
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <logger name="org.mongodb" level="WARN"/>
  <root level="DEBUG">
    <appender-ref ref="STDOUT"/>
  </root>
</configuration>