Updating mongodb node modules.

This commit is contained in:
Samuel Clay 2019-04-13 15:32:47 -04:00
parent 2e6ad3afda
commit 6c8097aa05
122 changed files with 28260 additions and 27317 deletions

node/node_modules/bson/HISTORY.md generated vendored

@@ -1,3 +1,67 @@
<a name="1.0.9"></a>
## [1.0.9](https://github.com/mongodb/js-bson/compare/v1.0.8...v1.0.9) (2018-06-07)
### Bug Fixes
* **serializer:** remove use of `const` ([5feb12f](https://github.com/mongodb/js-bson/commit/5feb12f))
<a name="1.0.7"></a>
## [1.0.7](https://github.com/mongodb/js-bson/compare/v1.0.6...v1.0.7) (2018-06-06)
### Bug Fixes
* **binary:** add type checking for buffer ([26b05b5](https://github.com/mongodb/js-bson/commit/26b05b5))
* **bson:** fix custom inspect property ([080323b](https://github.com/mongodb/js-bson/commit/080323b))
* **readme:** clarify documentation about deserialize methods ([20f764c](https://github.com/mongodb/js-bson/commit/20f764c))
* **serialization:** normalize function stringification ([1320c10](https://github.com/mongodb/js-bson/commit/1320c10))
<a name="1.0.6"></a>
## [1.0.6](https://github.com/mongodb/js-bson/compare/v1.0.5...v1.0.6) (2018-03-12)
### Features
* **serialization:** support arbitrary sizes for the internal serialization buffer ([abe97bc](https://github.com/mongodb/js-bson/commit/abe97bc))
<a name="1.0.5"></a>
## 1.0.5 (2018-02-26)
### Bug Fixes
* **decimal128:** add basic guard against REDOS attacks ([bd61c45](https://github.com/mongodb/js-bson/commit/bd61c45))
* **objectid:** if pid is 1, use random value ([e188ae6](https://github.com/mongodb/js-bson/commit/e188ae6))
1.0.4 2016-01-11
----------------
- #204 remove Buffer.from as it's partially broken in the early 4.x.x series of Node releases.
1.0.3 2016-01-03
----------------
- Fixed toString for ObjectId so it will work with inspect.
1.0.2 2016-01-02
----------------
- Minor optimizations for ObjectID to use Buffer.from where available.
1.0.1 2016-12-06
----------------
- Reverse behavior for undefined to be serialized as NULL. MongoDB 3.4 does not allow for undefined comparisons.
1.0.0 2016-12-06
----------------
- Introduced new BSON API and documentation.
0.5.7 2016-11-18
-----------------
- NODE-848 BSON Regex flags must be alphabetically ordered.

node/node_modules/bson/README.md generated vendored

@@ -1,8 +1,19 @@
# BSON parser
If you don't yet know what BSON actually is, read [the spec](http://bsonspec.org).
BSON is short for Binary JSON and is the binary-encoded serialization of JSON-like documents. You can learn more about it in [the specification](http://bsonspec.org).
This package can be used to serialize JSON documents into the BSON format or the other way around. If you want to use it within the browser, give [browserify](https://github.com/substack/node-browserify) a try (it will help you add this package to your bundle). The current build is located in the `browser_build/bson.js` file.
This browser version of the BSON parser is compiled using [webpack](https://webpack.js.org/) and the current version is pre-compiled in the `browser_build` directory.
This is the default BSON parser; however, there is also a C++ Node.js addon version that does not support the browser. It can be found at [mongodb-js/bson-ext](https://github.com/mongodb-js/bson-ext).
## Usage
To build a new version perform the following operations:
```
npm install
npm run build
```
A simple example of how to use BSON in the browser:
@@ -11,85 +22,149 @@ A simple example of how to use BSON in the browser:
<script>
function start() {
var BSON = bson().BSON
var Long = bson().Long
// Get the Long type
var Long = BSON.Long;
// Create a bson parser instance
var bson = new BSON();
// Serialize document
var doc = { long: Long.fromNumber(100) }
// Serialize a document
var data = BSON.serialize(doc, false, true, false)
var data = bson.serialize(doc)
// Deserialize it again
var doc_2 = BSON.deserialize(data)
var doc_2 = bson.deserialize(data)
}
</script>
```
A simple example of how to use BSON in `node.js`:
A simple example of how to use BSON in `Node.js`:
```js
var bson = require('bson')
var BSON = new bson.BSONPure.BSON()
var Long = bson.BSONPure.Long
// Get BSON parser class
var BSON = require('bson')
// Get the Long type
var Long = BSON.Long;
// Create a bson parser instance
var bson = new BSON();
// Serialize document
var doc = { long: Long.fromNumber(100) }
// Serialize a document
var data = BSON.serialize(doc, false, true, false)
var data = bson.serialize(doc)
console.log('data:', data)
// Deserialize the resulting Buffer
var doc_2 = BSON.deserialize(data)
var doc_2 = bson.deserialize(data)
console.log('doc_2:', doc_2)
```
## API
The API consists of two simple methods to serialize/deserialize objects to/from BSON format:
## Installation
`npm install bson`
## API
### BSON types
For all BSON types documentation, please refer to the following sources:
* [MongoDB BSON Type Reference](https://docs.mongodb.com/manual/reference/bson-types/)
* [BSON Spec](https://bsonspec.org/)
### BSON serialization and deserialization
**`new bson.BSONPure.BSON()`** - Creates a new BSON seralizer/deserializer you can use to serialize and deserialize BSON.
**`new BSON()`** - Creates a new BSON serializer/deserializer you can use to serialize and deserialize BSON.
* BSON.serialize(object, checkKeys, asBuffer, serializeFunctions)
* @param {Object} object the Javascript object to serialize.
* @param {Boolean} checkKeys the serializer will check if keys are valid.
* @param {Boolean} asBuffer return the serialized object as a Buffer object **(ignore)**.
* @param {Boolean} serializeFunctions serialize the javascript functions **(default:false)**
* @return {TypedArray/Array} returns a TypedArray or Array depending on what your browser supports
#### BSON.serialize
* BSON.deserialize(buffer, options, isArray)
* Options
* **evalFunctions** {Boolean, default:false}, evaluate functions in the BSON document scoped to the object deserialized.
* **cacheFunctions** {Boolean, default:false}, cache evaluated functions for reuse.
* **cacheFunctionsCrc32** {Boolean, default:false}, use a crc32 code for caching, otherwise use the string of the function.
* **promoteBuffers** {Boolean, default:false}, deserialize Binary data directly into node.js Buffer object.
* @param {TypedArray/Array} a TypedArray/Array containing the BSON data
* @param {Object} [options] additional options used for the deserialization.
* @param {Boolean} [isArray] ignore used for recursive parsing.
* @return {Object} returns the deserialized Javascript Object.
The BSON `serialize` method takes a JavaScript object and an optional options object and returns a Node.js Buffer.
### ObjectId
* `BSON.serialize(object, options)`
* @param {Object} object the JavaScript object to serialize.
* @param {Boolean} [options.checkKeys=false] the serializer will check if keys are valid.
* @param {Boolean} [options.serializeFunctions=false] serialize the JavaScript functions.
* @param {Boolean} [options.ignoreUndefined=true] ignore undefined fields.
* @return {Buffer} returns a Buffer instance.
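For example, a minimal sketch of calling `serialize` with explicit options (the document and option values here are illustrative):
```js
var BSON = require('bson');
var bson = new BSON();

var doc = { title: 'example', draft: undefined };

var data = bson.serialize(doc, {
  checkKeys: true,           // check that keys are valid
  serializeFunctions: false, // do not serialize JavaScript functions
  ignoreUndefined: true      // drop keys whose value is undefined
});

console.log(Buffer.isBuffer(data)); // true
```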
**`bson.ObjectId.isValid(id)`** - Returns true if `id` is a valid number or hexadecimal string representing an ObjectId.
**`bson.ObjectId.createFromHexString(hexString)`** - Returns the ObjectId the `hexString` represents.
**`bson.ObjectId.createFromTime(time)`** - Returns an ObjectId containing the passed time.
* `time` - A Unix timestamp (number of seconds since the epoch).
#### BSON.serializeWithBufferAndIndex
**`var objectId = new bson.ObjectId(id)`** - Creates a new `ObjectId`.
* `id` - Must either be a 24-character hex string or a 12 byte binary string.
The BSON `serializeWithBufferAndIndex` method takes an object, a target buffer instance and an optional options object and returns the end serialization index in the final buffer.
**`objectId.toJSON()`**
**`objectId.toString()`**
**`objectId.toHexString()`** - Returns a hexadecimal string representation of the ObjectId.
* `BSON.serializeWithBufferAndIndex(object, buffer, options)`
* @param {Object} object the JavaScript object to serialize.
* @param {Buffer} buffer the Buffer you pre-allocated to store the serialized BSON object.
* @param {Boolean} [options.checkKeys=false] the serializer will check if keys are valid.
* @param {Boolean} [options.serializeFunctions=false] serialize the JavaScript functions.
* @param {Boolean} [options.ignoreUndefined=true] ignore undefined fields.
* @param {Number} [options.index=0] the index in the buffer where we wish to start serializing into.
* @return {Number} returns the index pointing to the last written byte in the buffer.
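A short sketch of serializing two documents back to back into a pre-allocated buffer, assuming the returned value is the index of the last written byte as documented above:
```js
var BSON = require('bson');
var bson = new BSON();

var target = Buffer.alloc(256); // pre-allocated destination buffer

var end1 = bson.serializeWithBufferAndIndex({ a: 1 }, target);
// Start the second document one byte past the end of the first
var end2 = bson.serializeWithBufferAndIndex({ b: 2 }, target, { index: end1 + 1 });

console.log(end1, end2); // index of the last byte written for each document
```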
**`objectId.equals(otherObjectId)`** - Returns true if the ObjectIds are the same, false otherwise.
#### BSON.calculateObjectSize
**`objectId.getTimestamp()`** - Returns a `Date` object containing the time the objectId was created for.
The BSON `calculateObjectSize` method takes a JavaScript object and an optional options object and returns the size of the BSON object.
**`objectId.getTimestamp()`** - Returns a `Date` object containing the time the objectId contains.
* `BSON.calculateObjectSize(object, options)`
* @param {Object} object the JavaScript object to serialize.
* @param {Boolean} [options.serializeFunctions=false] serialize the JavaScript functions.
* @param {Boolean} [options.ignoreUndefined=true] ignore undefined fields.
* @return {Number} returns the size of the BSON document in bytes.
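As an illustrative check, the computed size should match the length of the buffer produced by `serialize` for the same document:
```js
var BSON = require('bson');
var bson = new BSON();

var doc = { long: BSON.Long.fromNumber(100) };

var size = bson.calculateObjectSize(doc);
console.log(size === bson.serialize(doc).length); // true
```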
#### BSON.deserialize
The BSON `deserialize` method takes a Node.js Buffer and an optional options object and returns a deserialized JavaScript object.
* `BSON.deserialize(buffer, options)`
* @param {Object} [options.evalFunctions=false] evaluate functions in the BSON document scoped to the object deserialized.
* @param {Object} [options.cacheFunctions=false] cache evaluated functions for reuse.
* @param {Object} [options.cacheFunctionsCrc32=false] use a crc32 code for caching, otherwise use the string of the function.
* @param {Object} [options.promoteLongs=true] when deserializing a Long will fit it into a Number if it's smaller than 53 bits
* @param {Object} [options.promoteBuffers=false] when deserializing a Binary will return it as a Node.js Buffer instance.
* @param {Object} [options.promoteValues=false] when deserializing will promote BSON values to their Node.js closest equivalent types.
* @param {Object} [options.fieldsAsRaw=null] allows specifying which fields should be returned as raw, unserialized Buffers.
* @param {Object} [options.bsonRegExp=false] return BSON regular expressions as BSONRegExp instances.
* @return {Object} returns the deserialized Javascript Object.
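A small sketch of the `promoteLongs` option described above (the document is illustrative):
```js
var BSON = require('bson');
var bson = new BSON();

var data = bson.serialize({ n: BSON.Long.fromNumber(5) });

// With the default promoteLongs: true, small 64-bit values come back as Numbers
console.log(typeof bson.deserialize(data).n); // 'number'

// Opting out keeps them as Long instances
var doc = bson.deserialize(data, { promoteLongs: false });
console.log(doc.n instanceof BSON.Long); // true
```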
#### BSON.deserializeStream
The BSON `deserializeStream` method takes a Node.js Buffer and a `startIndex`, and allows more control over deserializing a Buffer that contains concatenated BSON documents.
* `BSON.deserializeStream(buffer, startIndex, numberOfDocuments, documents, docStartIndex, options)`
* @param {Buffer} buffer the buffer containing the serialized set of BSON documents.
* @param {Number} startIndex the start index in the data Buffer where the deserialization is to start.
* @param {Number} numberOfDocuments number of documents to deserialize.
* @param {Array} documents an array where to store the deserialized documents.
* @param {Number} docStartIndex the index in the documents array from where to start inserting documents.
* @param {Object} [options.evalFunctions=false] evaluate functions in the BSON document scoped to the object deserialized.
* @param {Object} [options.cacheFunctions=false] cache evaluated functions for reuse.
* @param {Object} [options.cacheFunctionsCrc32=false] use a crc32 code for caching, otherwise use the string of the function.
* @param {Object} [options.promoteLongs=true] when deserializing a Long will fit it into a Number if it's smaller than 53 bits
* @param {Object} [options.promoteBuffers=false] when deserializing a Binary will return it as a Node.js Buffer instance.
* @param {Object} [options.promoteValues=false] when deserializing will promote BSON values to their Node.js closest equivalent types.
* @param {Object} [options.fieldsAsRaw=null] allows specifying which fields should be returned as raw, unserialized Buffers.
* @param {Object} [options.bsonRegExp=false] return BSON regular expressions as BSONRegExp instances.
* @return {Number} returns the next index in the buffer after deserializing **x** documents.
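A sketch of deserializing two concatenated documents out of a single Buffer using the parameters listed above:
```js
var BSON = require('bson');
var bson = new BSON();

// Two BSON documents concatenated into one buffer
var data = Buffer.concat([bson.serialize({ a: 1 }), bson.serialize({ b: 2 })]);

var docs = new Array(2);
var nextIndex = bson.deserializeStream(data, 0, 2, docs, 0);

console.log(docs);      // [ { a: 1 }, { b: 2 } ]
console.log(nextIndex); // index just past the last parsed document (data.length here)
```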
## FAQ
#### Why does `undefined` get converted to `null`?
The `undefined` BSON type has been [deprecated for many years](http://bsonspec.org/spec.html), so this library has dropped support for it. Use the `ignoreUndefined` option (for example, from the [driver](http://mongodb.github.io/node-mongodb-native/2.2/api/MongoClient.html#connect)) to remove `undefined` keys instead.
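An illustrative sketch of the difference `ignoreUndefined` makes:
```js
var BSON = require('bson');
var bson = new BSON();

// ignoreUndefined: true (the documented default) drops the key entirely
var dropped = bson.serialize({ a: 1, b: undefined }, { ignoreUndefined: true });
console.log(bson.deserialize(dropped)); // { a: 1 }

// ignoreUndefined: false serializes undefined as BSON null
var kept = bson.serialize({ a: 1, b: undefined }, { ignoreUndefined: false });
console.log(bson.deserialize(kept)); // { a: 1, b: null }
```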
#### How do I add custom serialization logic?
This library looks for `toBSON()` functions on every path, and calls the `toBSON()` function to get the value to serialize.
```javascript
var bson = new BSON();
class CustomSerialize {
toBSON() {
return 42;
}
}
const obj = { answer: new CustomSerialize() };
// "{ answer: 42 }"
console.log(bson.deserialize(bson.serialize(obj)));
```

File diff suppressed because it is too large


@@ -1,429 +0,0 @@
/// reduced to ~ 410 LOCs (parser only 300 vs. 1400+) with (some, needed) BSON classes "inlined".
/// Compare ~ 4,300 (22KB vs. 157KB) in browser build at: https://github.com/mongodb/js-bson/blob/master/browser_build/bson.js
module.exports.calculateObjectSize = calculateObjectSize;
function calculateObjectSize(object) {
var totalLength = (4 + 1); /// handles the obj.length prefix + terminating '0' ?!
for(var key in object) { /// looks like it handles arrays under the same for...in loop!?
totalLength += calculateElement(key, object[key])
}
return totalLength;
}
function calculateElement(name, value) {
var len = 1; /// always starting with 1 for the data type byte!
if (name) len += Buffer.byteLength(name, 'utf8') + 1; /// cstring: name + '0' termination
if (value === undefined || value === null) return len; /// just the type byte plus name cstring
switch( value.constructor ) { /// removed all checks 'isBuffer' if Node.js Buffer class is present!?
case ObjectID: /// we want these sorted from most common case to least common/deprecated;
return len + 12;
case String:
return len + 4 + Buffer.byteLength(value, 'utf8') +1; ///
case Number:
if (Math.floor(value) === value) { /// case: integer; pos.# more common, '&&' stops if 1st fails!
if ( value <= 2147483647 && value >= -2147483647 ) // 32 bit
return len + 4;
else return len + 8; /// covers Long-ish JS integers as Longs!
} else return len + 8; /// 8+1 --- covers Double & std. float
case Boolean:
return len + 1;
case Array:
case Object:
return len + calculateObjectSize(value);
case Buffer: /// replaces the entire Binary class!
return len + 4 + value.length + 1;
case Regex: /// these are handled as strings by serializeFast() later, hence 'gim' opts = 3 + 1 chars
return len + Buffer.byteLength(value.source, 'utf8') + 1
+ (value.global ? 1 : 0) + (value.ignoreCase ? 1 : 0) + (value.multiline ? 1 : 0) +1;
case Date:
case Long:
case Timestamp:
case Double:
return len + 8;
case MinKey:
case MaxKey:
return len; /// these two return the type byte and name cstring only!
}
return 0;
}
module.exports.serializeFast = serializeFast;
module.exports.serialize = function(object, checkKeys, asBuffer, serializeFunctions, index) {
var buffer = new Buffer(calculateObjectSize(object));
return serializeFast(object, checkKeys, buffer, 0);
}
function serializeFast(object, checkKeys, buffer, i) { /// set checkKeys = false in query(..., options object to save performance IFF you're certain your keys are safe/system-set!
var size = buffer.length;
buffer[i++] = size & 0xff; buffer[i++] = (size >> 8) & 0xff; /// these get overwritten later!
buffer[i++] = (size >> 16) & 0xff; buffer[i++] = (size >> 24) & 0xff;
if (object.constructor === Array) { /// any need to checkKeys here?!? since we're doing for rather than for...in, should be safe from extra (non-numeric) keys added to the array?!
for(var j = 0; j < object.length; j++) {
i = packElement(j.toString(), object[j], checkKeys, buffer, i);
}
} else { /// checkKeys is needed if any suspicion of end-user key tampering/"injection" (a la SQL)
for(var key in object) { /// mostly there should never be direct access to them!?
if (checkKeys && (key.indexOf('\x00') >= 0 || key === '$where') ) { /// = "no script"?!; could add back key.indexOf('$') or maybe check for 'eval'?!
/// took out: || key.indexOf('.') >= 0... Don't we allow dot notation queries?!
console.log('checkKeys error: ');
return new Error('Illegal object key!');
}
i = packElement(key, object[key], checkKeys, buffer, i); /// checkKeys pass needed for recursion!
}
}
buffer[i++] = 0; /// write terminating zero; !we do NOT -1 the index increase here as original does!
return i;
}
function packElement(name, value, checkKeys, buffer, i) { /// serializeFunctions removed! checkKeys needed for Array & Object cases pass through (calling serializeFast recursively!)
if (value === undefined || value === null){
buffer[i++] = 10; /// = BSON.BSON_DATA_NULL;
i += buffer.write(name, i, 'utf8'); buffer[i++] = 0; /// buffer.write(...) returns bytesWritten!
return i;
}
switch(value.constructor) {
case ObjectID:
buffer[i++] = 7; /// = BSON.BSON_DATA_OID;
i += buffer.write(name, i, 'utf8'); buffer[i++] = 0;
/// i += buffer.write(value.id, i, 'binary'); /// OLD: writes a String to a Buffer; 'binary' deprecated!!
value.id.copy(buffer, i); /// NEW ObjectID version has this.id = Buffer at the ready!
return i += 12;
case String:
buffer[i++] = 2; /// = BSON.BSON_DATA_STRING;
i += buffer.write(name, i, 'utf8'); buffer[i++] = 0;
var size = Buffer.byteLength(value) + 1; /// includes the terminating '0'!?
buffer[i++] = size & 0xff; buffer[i++] = (size >> 8) & 0xff;
buffer[i++] = (size >> 16) & 0xff; buffer[i++] = (size >> 24) & 0xff;
i += buffer.write(value, i, 'utf8'); buffer[i++] = 0;
return i;
case Number:
if ( ~~(value) === value) { /// double-Tilde is equiv. to Math.floor(value)
if ( value <= 2147483647 && value >= -2147483647){ /// = BSON.BSON_INT32_MAX / MIN asf.
buffer[i++] = 16; /// = BSON.BSON_DATA_INT;
i += buffer.write(name, i, 'utf8'); buffer[i++] = 0;
buffer[i++] = value & 0xff; buffer[i++] = (value >> 8) & 0xff;
buffer[i++] = (value >> 16) & 0xff; buffer[i++] = (value >> 24) & 0xff;
// Else large-ish JS int!? to Long!?
} else { /// if (value <= BSON.JS_INT_MAX && value >= BSON.JS_INT_MIN){ /// 9007199254740992 asf.
buffer[i++] = 18; /// = BSON.BSON_DATA_LONG;
i += buffer.write(name, i, 'utf8'); buffer[i++] = 0;
var lowBits = ( value % 4294967296 ) | 0, highBits = ( value / 4294967296 ) | 0;
buffer[i++] = lowBits & 0xff; buffer[i++] = (lowBits >> 8) & 0xff;
buffer[i++] = (lowBits >> 16) & 0xff; buffer[i++] = (lowBits >> 24) & 0xff;
buffer[i++] = highBits & 0xff; buffer[i++] = (highBits >> 8) & 0xff;
buffer[i++] = (highBits >> 16) & 0xff; buffer[i++] = (highBits >> 24) & 0xff;
}
} else { /// we have a float / Double
buffer[i++] = 1; /// = BSON.BSON_DATA_NUMBER;
i += buffer.write(name, i, 'utf8'); buffer[i++] = 0;
/// OLD: writeIEEE754(buffer, value, i, 'little', 52, 8);
buffer.writeDoubleLE(value, i); i += 8;
}
return i;
case Boolean:
buffer[i++] = 8; /// = BSON.BSON_DATA_BOOLEAN;
i += buffer.write(name, i, 'utf8'); buffer[i++] = 0;
buffer[i++] = value ? 1 : 0;
return i;
case Array:
case Object:
buffer[i++] = value.constructor === Array ? 4 : 3; /// = BSON.BSON_DATA_ARRAY / _OBJECT;
i += buffer.write(name, i, 'utf8'); buffer[i++] = 0;
var endIndex = serializeFast(value, checkKeys, buffer, i); /// + 4); no longer needed b/c serializeFast writes a temp 4 bytes for length
var size = endIndex - i;
buffer[i++] = size & 0xff; buffer[i++] = (size >> 8) & 0xff;
buffer[i++] = (size >> 16) & 0xff; buffer[i++] = (size >> 24) & 0xff;
return endIndex;
/// case Binary: /// is basically identical unless special/deprecated options!
case Buffer: /// solves ALL of our Binary needs without the BSON.Binary class!?
buffer[i++] = 5; /// = BSON.BSON_DATA_BINARY;
i += buffer.write(name, i, 'utf8'); buffer[i++] = 0;
var size = value.length;
buffer[i++] = size & 0xff; buffer[i++] = (size >> 8) & 0xff;
buffer[i++] = (size >> 16) & 0xff; buffer[i++] = (size >> 24) & 0xff;
buffer[i++] = 0; /// write BSON.BSON_BINARY_SUBTYPE_DEFAULT;
value.copy(buffer, i); ///, 0, size); << defaults to sourceStart=0, sourceEnd=sourceBuffer.length);
i += size;
return i;
case RegExp:
buffer[i++] = 11; /// = BSON.BSON_DATA_REGEXP;
i += buffer.write(name, i, 'utf8'); buffer[i++] = 0;
i += buffer.write(value.source, i, 'utf8'); buffer[i++] = 0x00;
if (value.global) buffer[i++] = 0x73; // s = 'g' for JS Regex!
if (value.ignoreCase) buffer[i++] = 0x69; // i
if (value.multiline) buffer[i++] = 0x6d; // m
buffer[i++] = 0x00;
return i;
case Date:
buffer[i++] = 9; /// = BSON.BSON_DATA_DATE;
i += buffer.write(name, i, 'utf8'); buffer[i++] = 0;
var millis = value.getTime();
var lowBits = ( millis % 4294967296 ) | 0, highBits = ( millis / 4294967296 ) | 0;
buffer[i++] = lowBits & 0xff; buffer[i++] = (lowBits >> 8) & 0xff;
buffer[i++] = (lowBits >> 16) & 0xff; buffer[i++] = (lowBits >> 24) & 0xff;
buffer[i++] = highBits & 0xff; buffer[i++] = (highBits >> 8) & 0xff;
buffer[i++] = (highBits >> 16) & 0xff; buffer[i++] = (highBits >> 24) & 0xff;
return i;
case Long:
case Timestamp:
buffer[i++] = value.constructor === Long ? 18 : 17; /// = BSON.BSON_DATA_LONG / _TIMESTAMP
i += buffer.write(name, i, 'utf8'); buffer[i++] = 0;
var lowBits = value.getLowBits(), highBits = value.getHighBits();
buffer[i++] = lowBits & 0xff; buffer[i++] = (lowBits >> 8) & 0xff;
buffer[i++] = (lowBits >> 16) & 0xff; buffer[i++] = (lowBits >> 24) & 0xff;
buffer[i++] = highBits & 0xff; buffer[i++] = (highBits >> 8) & 0xff;
buffer[i++] = (highBits >> 16) & 0xff; buffer[i++] = (highBits >> 24) & 0xff;
return i;
case Double:
buffer[i++] = 1; /// = BSON.BSON_DATA_NUMBER;
i += buffer.write(name, i, 'utf8'); buffer[i++] = 0;
/// OLD: writeIEEE754(buffer, value, i, 'little', 52, 8); i += 8;
buffer.writeDoubleLE(value, i); i += 8;
return i
case MinKey: /// = BSON.BSON_DATA_MINKEY;
buffer[i++] = 127; i += buffer.write(name, i, 'utf8'); buffer[i++] = 0;
return i;
case MaxKey: /// = BSON.BSON_DATA_MAXKEY;
buffer[i++] = 255; i += buffer.write(name, i, 'utf8'); buffer[i++] = 0;
return i;
} /// end of switch
return i; /// ?! If no value to serialize
}
module.exports.deserializeFast = deserializeFast;
function deserializeFast(buffer, i, isArray){ //// , options, isArray) { //// no more options!
if (buffer.length < 5) return new Error('Corrupt bson message < 5 bytes long'); /// from 'throw'
var elementType, tempindex = 0, name;
var string, low, high; /// = lowBits / highBits
/// using 'i' as the index to keep the lines shorter:
i || ( i = 0 ); /// for parseResponse it's 0; set to running index in deserialize(object/array) recursion
var object = isArray ? [] : {}; /// needed for type ARRAY recursion later!
var size = buffer[i++] | buffer[i++] << 8 | buffer[i++] << 16 | buffer[i++] << 24;
if(size < 5 || size > buffer.length) return new Error('Corrupt BSON message');
/// 'size' var was not used by anything after this, so we can reuse it
while(true) { // While we have more left data left keep parsing
elementType = buffer[i++]; // Read the type
if (elementType === 0) break; // If we get a zero it's the last byte, exit
tempindex = i; /// inlined readCStyleString & removed extra i<buffer.length check slowing EACH loop!
while( buffer[tempindex] !== 0x00 ) tempindex++; /// read ahead w/out changing main 'i' index
if (tempindex >= buffer.length) return new Error('Corrupt BSON document: illegal CString')
name = buffer.toString('utf8', i, tempindex);
i = tempindex + 1; /// Update index position to after the string + '0' termination
switch(elementType) {
case 7: /// = BSON.BSON_DATA_OID:
var buf = new Buffer(12);
buffer.copy(buf, 0, i, i += 12 ); /// copy 12 bytes from the current 'i' offset into fresh Buffer
object[name] = new ObjectID(buf); ///... & attach to the new ObjectID instance
break;
case 2: /// = BSON.BSON_DATA_STRING:
size = buffer[i++] | buffer[i++] <<8 | buffer[i++] <<16 | buffer[i++] <<24;
object[name] = buffer.toString('utf8', i, i += size -1 );
i++; break; /// need to get the '0' index "tick-forward" back!
case 16: /// = BSON.BSON_DATA_INT: // Decode the 32bit value
object[name] = buffer[i++] | buffer[i++] << 8 | buffer[i++] << 16 | buffer[i++] << 24; break;
case 1: /// = BSON.BSON_DATA_NUMBER: // Decode the double value
object[name] = buffer.readDoubleLE(i); /// slightly faster depending on dec.points; a LOT cleaner
/// OLD: object[name] = readIEEE754(buffer, i, 'little', 52, 8);
i += 8; break;
case 8: /// = BSON.BSON_DATA_BOOLEAN:
object[name] = buffer[i++] == 1; break;
case 6: /// = BSON.BSON_DATA_UNDEFINED: /// deprecated
case 10: /// = BSON.BSON_DATA_NULL:
object[name] = null; break;
case 4: /// = BSON.BSON_DATA_ARRAY
size = buffer[i] | buffer[i+1] <<8 | buffer[i+2] <<16 | buffer[i+3] <<24; /// NO 'i' increment since the size bytes are reread during the recursion!
object[name] = deserializeFast(buffer, i, true ); /// pass current index & set isArray = true
i += size; break;
case 3: /// = BSON.BSON_DATA_OBJECT:
size = buffer[i] | buffer[i+1] <<8 | buffer[i+2] <<16 | buffer[i+3] <<24;
object[name] = deserializeFast(buffer, i, false ); /// isArray = false => Object
i += size; break;
case 5: /// = BSON.BSON_DATA_BINARY: // Decode the size of the binary blob
size = buffer[i++] | buffer[i++] << 8 | buffer[i++] << 16 | buffer[i++] << 24;
buffer[i++]; /// Skip, as we assume always default subtype, i.e. 0!
object[name] = buffer.slice(i, i += size); /// creates a new Buffer "slice" view of the same memory!
break;
case 9: /// = BSON.BSON_DATA_DATE: /// SEE notes below on the Date type vs. other options...
low = buffer[i++] | buffer[i++] << 8 | buffer[i++] << 16 | buffer[i++] << 24;
high = buffer[i++] | buffer[i++] << 8 | buffer[i++] << 16 | buffer[i++] << 24;
object[name] = new Date( high * 4294967296 + (low < 0 ? low + 4294967296 : low) ); break;
case 18: /// = BSON.BSON_DATA_LONG: /// usage should be somewhat rare beyond parseResponse() -> cursorId, where it is handled inline, NOT as part of deserializeFast(returnedObjects); get lowBits, highBits:
low = buffer[i++] | buffer[i++] << 8 | buffer[i++] << 16 | buffer[i++] << 24;
high = buffer[i++] | buffer[i++] << 8 | buffer[i++] << 16 | buffer[i++] << 24;
size = high * 4294967296 + (low < 0 ? low + 4294967296 : low); /// from long.toNumber()
if (size < JS_INT_MAX && size > JS_INT_MIN) object[name] = size; /// positive # more likely!
else object[name] = new Long(low, high); break;
case 127: /// = BSON.BSON_DATA_MIN_KEY: /// do we EVER actually get these BACK from MongoDB server?!
object[name] = new MinKey(); break;
case 255: /// = BSON.BSON_DATA_MAX_KEY:
object[name] = new MaxKey(); break;
case 17: /// = BSON.BSON_DATA_TIMESTAMP: /// somewhat obscure internal BSON type; MongoDB uses it for (pseudo) high-res time timestamp (past millisecs precision is just a counter!) in the Oplog ts: field, etc.
low = buffer[i++] | buffer[i++] << 8 | buffer[i++] << 16 | buffer[i++] << 24;
high = buffer[i++] | buffer[i++] << 8 | buffer[i++] << 16 | buffer[i++] << 24;
object[name] = new Timestamp(low, high); break;
/// case 11: /// = RegExp is skipped; we should NEVER be getting any from the MongoDB server!?
} /// end of switch(elementType)
} /// end of while(1)
return object; // Return the finalized object
}
function MinKey() { this._bsontype = 'MinKey'; } /// these are merely placeholders/stubs to signify the type!?
function MaxKey() { this._bsontype = 'MaxKey'; }
function Long(low, high) {
this._bsontype = 'Long';
this.low_ = low | 0; this.high_ = high | 0; /// force into 32 signed bits.
}
Long.prototype.getLowBits = function(){ return this.low_; }
Long.prototype.getHighBits = function(){ return this.high_; }
Long.prototype.toNumber = function(){
return this.high_ * 4294967296 + (this.low_ < 0 ? this.low_ + 4294967296 : this.low_);
}
Long.fromNumber = function(num){
return new Long(num % 4294967296, num / 4294967296); /// |0 is forced in the constructor!
}
function Double(value) {
this._bsontype = 'Double';
this.value = value;
}
function Timestamp(low, high) {
this._bsontype = 'Timestamp';
this.low_ = low | 0; this.high_ = high | 0; /// force into 32 signed bits.
}
Timestamp.prototype.getLowBits = function(){ return this.low_; }
Timestamp.prototype.getHighBits = function(){ return this.high_; }
/////////////////////////////// ObjectID /////////////////////////////////
/// machine & proc IDs stored as 1 string, b/c Buffer shouldn't be held for long periods (could use SlowBuffer?!)
var MACHINE = parseInt(Math.random() * 0xFFFFFF, 10);
var PROCESS = process.pid % 0xFFFF;
var MACHINE_AND_PROC = encodeIntBE(MACHINE, 3) + encodeIntBE(PROCESS, 2); /// keep as ONE string, ready to go.
function encodeIntBE(data, bytes){ /// encode the bytes to a string
var result = '';
if (bytes >= 4){ result += String.fromCharCode(Math.floor(data / 0x1000000)); data %= 0x1000000; }
if (bytes >= 3){ result += String.fromCharCode(Math.floor(data / 0x10000)); data %= 0x10000; }
if (bytes >= 2){ result += String.fromCharCode(Math.floor(data / 0x100)); data %= 0x100; }
result += String.fromCharCode(Math.floor(data));
return result;
}
var _counter = ~~(Math.random() * 0xFFFFFF); /// double-tilde is equivalent to Math.floor()
var checkForHex = new RegExp('^[0-9a-fA-F]{24}$');
function ObjectID(id) {
this._bsontype = 'ObjectID';
if (!id){ this.id = createFromScratch(); /// base case, DONE.
} else {
if (id.constructor === Buffer){
this.id = id; /// case of
} else if (id.constructor === String) {
if ( id.length === 24 && checkForHex.test(id) ) {
this.id = new Buffer(id, 'hex');
} else {
this.id = new Error('Illegal/faulty Hexadecimal string supplied!'); /// changed from 'throw'
}
} else if (id.constructor === Number) {
this.id = createFromTime(id); /// this is what should be the only interface for this!?
}
}
}
function createFromScratch() {
var buf = new Buffer(12), i = 0;
var ts = ~~(Date.now()/1000); /// 4 bytes timestamp in seconds, BigEndian notation!
buf[i++] = (ts >> 24) & 0xFF; buf[i++] = (ts >> 16) & 0xFF;
buf[i++] = (ts >> 8) & 0xFF; buf[i++] = (ts) & 0xFF;
buf.write(MACHINE_AND_PROC, i, 5, 'utf8'); i += 5; /// write 3 bytes + 2 bytes MACHINE_ID and PROCESS_ID
_counter = ++_counter % 0xFFFFFF; /// 3 bytes internal _counter for subsecond resolution; BigEndian
buf[i++] = (_counter >> 16) & 0xFF;
buf[i++] = (_counter >> 8) & 0xFF;
buf[i++] = (_counter) & 0xFF;
return buf;
}
function createFromTime(ts) {
ts || ( ts = ~~(Date.now()/1000) ); /// 4 bytes timestamp in seconds only
var buf = new Buffer(12), i = 0;
buf[i++] = (ts >> 24) & 0xFF; buf[i++] = (ts >> 16) & 0xFF;
buf[i++] = (ts >> 8) & 0xFF; buf[i++] = (ts) & 0xFF;
for (;i < 12; ++i) buf[i] = 0x00; /// indeces 4 through 11 (8 bytes) get filled up with nulls
return buf;
}
ObjectID.prototype.toHexString = function toHexString() {
return this.id.toString('hex');
}
ObjectID.prototype.getTimestamp = function getTimestamp() {
return this.id.readUIntBE(0, 4);
}
ObjectID.prototype.getTimestampDate = function getTimestampDate() {
var ts = new Date();
ts.setTime(this.id.readUIntBE(0, 4) * 1000);
return ts;
}
ObjectID.createPk = function createPk () { ///?override if a PrivateKey factory w/ unique factors is warranted?!
return new ObjectID();
}
ObjectID.prototype.toJSON = function toJSON() {
return "ObjectID('" +this.id.toString('hex')+ "')";
}
/// module.exports.BSON = BSON; /// not needed anymore!? exports.Binary = Binary;
module.exports.ObjectID = ObjectID;
module.exports.MinKey = MinKey;
module.exports.MaxKey = MaxKey;
module.exports.Long = Long; /// ?! we really don't want to do the complicated Long math anywhere for now!?
//module.exports.Double = Double;
//module.exports.Timestamp = Timestamp;

File diff suppressed because it is too large


@@ -1,6 +1,6 @@
{ "name" : "bson"
, "description" : "A bson parser for node.js and the browser"
, "main": "../lib/bson/bson"
, "main": "../"
, "directories" : { "lib" : "../lib/bson" }
, "engines" : { "node" : ">=0.6.0" }
, "licenses" : [ { "type" : "Apache License, Version 2.0"


@@ -1,357 +0,0 @@
"use strict"
var writeIEEE754 = require('../float_parser').writeIEEE754,
readIEEE754 = require('../float_parser').readIEEE754,
f = require('util').format,
Long = require('../long').Long,
Double = require('../double').Double,
Timestamp = require('../timestamp').Timestamp,
ObjectID = require('../objectid').ObjectID,
Symbol = require('../symbol').Symbol,
Code = require('../code').Code,
MinKey = require('../min_key').MinKey,
MaxKey = require('../max_key').MaxKey,
DBRef = require('../db_ref').DBRef,
BSONRegExp = require('../regexp').BSONRegExp,
Binary = require('../binary').Binary;
var BSON = {};
/**
* Contains the function cache if we have that enable to allow for avoiding the eval step on each deserialization, comparison is by md5
*
* @ignore
* @api private
*/
var functionCache = BSON.functionCache = {};
/**
* Number BSON Type
*
* @classconstant BSON_DATA_NUMBER
**/
BSON.BSON_DATA_NUMBER = 1;
/**
* String BSON Type
*
* @classconstant BSON_DATA_STRING
**/
BSON.BSON_DATA_STRING = 2;
/**
* Object BSON Type
*
* @classconstant BSON_DATA_OBJECT
**/
BSON.BSON_DATA_OBJECT = 3;
/**
* Array BSON Type
*
* @classconstant BSON_DATA_ARRAY
**/
BSON.BSON_DATA_ARRAY = 4;
/**
* Binary BSON Type
*
* @classconstant BSON_DATA_BINARY
**/
BSON.BSON_DATA_BINARY = 5;
/**
* ObjectID BSON Type
*
* @classconstant BSON_DATA_OID
**/
BSON.BSON_DATA_OID = 7;
/**
* Boolean BSON Type
*
* @classconstant BSON_DATA_BOOLEAN
**/
BSON.BSON_DATA_BOOLEAN = 8;
/**
* Date BSON Type
*
* @classconstant BSON_DATA_DATE
**/
BSON.BSON_DATA_DATE = 9;
/**
* null BSON Type
*
* @classconstant BSON_DATA_NULL
**/
BSON.BSON_DATA_NULL = 10;
/**
* RegExp BSON Type
*
* @classconstant BSON_DATA_REGEXP
**/
BSON.BSON_DATA_REGEXP = 11;
/**
* Code BSON Type
*
* @classconstant BSON_DATA_CODE
**/
BSON.BSON_DATA_CODE = 13;
/**
* Symbol BSON Type
*
* @classconstant BSON_DATA_SYMBOL
**/
BSON.BSON_DATA_SYMBOL = 14;
/**
* Code with Scope BSON Type
*
* @classconstant BSON_DATA_CODE_W_SCOPE
**/
BSON.BSON_DATA_CODE_W_SCOPE = 15;
/**
* 32 bit Integer BSON Type
*
* @classconstant BSON_DATA_INT
**/
BSON.BSON_DATA_INT = 16;
/**
* Timestamp BSON Type
*
* @classconstant BSON_DATA_TIMESTAMP
**/
BSON.BSON_DATA_TIMESTAMP = 17;
/**
* Long BSON Type
*
* @classconstant BSON_DATA_LONG
**/
BSON.BSON_DATA_LONG = 18;
/**
* MinKey BSON Type
*
* @classconstant BSON_DATA_MIN_KEY
**/
BSON.BSON_DATA_MIN_KEY = 0xff;
/**
* MaxKey BSON Type
*
* @classconstant BSON_DATA_MAX_KEY
**/
BSON.BSON_DATA_MAX_KEY = 0x7f;
/**
* Binary Default Type
*
* @classconstant BSON_BINARY_SUBTYPE_DEFAULT
**/
BSON.BSON_BINARY_SUBTYPE_DEFAULT = 0;
/**
* Binary Function Type
*
* @classconstant BSON_BINARY_SUBTYPE_FUNCTION
**/
BSON.BSON_BINARY_SUBTYPE_FUNCTION = 1;
/**
* Binary Byte Array Type
*
* @classconstant BSON_BINARY_SUBTYPE_BYTE_ARRAY
**/
BSON.BSON_BINARY_SUBTYPE_BYTE_ARRAY = 2;
/**
* Binary UUID Type
*
* @classconstant BSON_BINARY_SUBTYPE_UUID
**/
BSON.BSON_BINARY_SUBTYPE_UUID = 3;
/**
* Binary MD5 Type
*
* @classconstant BSON_BINARY_SUBTYPE_MD5
**/
BSON.BSON_BINARY_SUBTYPE_MD5 = 4;
/**
* Binary User Defined Type
*
* @classconstant BSON_BINARY_SUBTYPE_USER_DEFINED
**/
BSON.BSON_BINARY_SUBTYPE_USER_DEFINED = 128;
// BSON MAX VALUES
BSON.BSON_INT32_MAX = 0x7FFFFFFF;
BSON.BSON_INT32_MIN = -0x80000000;
BSON.BSON_INT64_MAX = Math.pow(2, 63) - 1;
BSON.BSON_INT64_MIN = -Math.pow(2, 63);
// JS MAX PRECISE VALUES
BSON.JS_INT_MAX = 0x20000000000000; // Any integer up to 2^53 can be precisely represented by a double.
BSON.JS_INT_MIN = -0x20000000000000; // Any integer down to -2^53 can be precisely represented by a double.
// Internal long versions
var JS_INT_MAX_LONG = Long.fromNumber(0x20000000000000); // Any integer up to 2^53 can be precisely represented by a double.
var JS_INT_MIN_LONG = Long.fromNumber(-0x20000000000000); // Any integer down to -2^53 can be precisely represented by a double.
var deserialize = function(buffer, options, isArray) {
var index = 0;
// Read the document size
var size = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
// Ensure buffer is valid size
if(size < 5 || buffer.length < size) {
throw new Error("corrupt bson message");
}
// Illegal end value
if(buffer[size - 1] != 0) {
throw new Error("One object, sized correctly, with a spot for an EOO, but the EOO isn't 0x00");
}
// Start deserializtion
return deserializeObject(buffer, options, isArray);
}
// // Reads in a C style string
// var readCStyleStringSpecial = function(buffer, index) {
// // Get the start search index
// var i = index;
// // Locate the end of the c string
// while(buffer[i] !== 0x00 && i < buffer.length) {
// i++
// }
// // If are at the end of the buffer there is a problem with the document
// if(i >= buffer.length) throw new Error("Bad BSON Document: illegal CString")
// // Grab utf8 encoded string
// var string = buffer.toString('utf8', index, i);
// // Update index position
// index = i + 1;
// // Return string
// return {s: string, i: index};
// }
// Reads in a C style string
var readCStyleStringSpecial = function(buffer, index) {
// Get the start search index
var i = index;
// Locate the end of the c string
while(buffer[i] !== 0x00 && i < buffer.length) {
i++
}
// If are at the end of the buffer there is a problem with the document
if(i >= buffer.length) throw new Error("Bad BSON Document: illegal CString")
// Grab utf8 encoded string
return buffer.toString('utf8', index, i);
}
var DeserializationMethods = {}
DeserializationMethods[BSON.BSON_DATA_OID] = function(name, object, buffer, index) {
var string = buffer.toString('binary', index, index + 12);
object[name] = new ObjectID(string);
return index + 12;
}
DeserializationMethods[BSON.BSON_DATA_NUMBER] = function(name, object, buffer, index) {
object[name] = buffer.readDoubleLE(index);
return index + 8;
}
DeserializationMethods[BSON.BSON_DATA_INT] = function(name, object, buffer, index) {
object[name] = buffer.readInt32LE(index);
return index + 4;
}
DeserializationMethods[BSON.BSON_DATA_TIMESTAMP] = function(name, object, buffer, index) {
var lowBits = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
var highBits = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
object[name] = new Timestamp(lowBits, highBits);
return index;
}
DeserializationMethods[BSON.BSON_DATA_STRING] = function(name, object, buffer, index) {
var stringSize = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
if(stringSize <= 0 || stringSize > (buffer.length - index) || buffer[index + stringSize - 1] != 0) throw new Error("bad string length in bson");
object[name] = buffer.toString('utf8', index, index + stringSize - 1);
return index + stringSize;
}
DeserializationMethods[BSON.BSON_DATA_BOOLEAN] = function(name, object, buffer, index) {
object[name] = buffer[index++] == 1;
return index;
}
var deserializeObject = function(buffer, options, isArray) {
// Options
options = options == null ? {} : options;
var evalFunctions = options['evalFunctions'] == null ? false : options['evalFunctions'];
var cacheFunctions = options['cacheFunctions'] == null ? false : options['cacheFunctions'];
var cacheFunctionsCrc32 = options['cacheFunctionsCrc32'] == null ? false : options['cacheFunctionsCrc32'];
var promoteLongs = options['promoteLongs'] == null ? true : options['promoteLongs'];
var fieldsAsRaw = options['fieldsAsRaw'] == null ? {} : options['fieldsAsRaw'];
// Return BSONRegExp objects instead of native regular expressions
var bsonRegExp = typeof options['bsonRegExp'] == 'boolean' ? options['bsonRegExp'] : false;
var promoteBuffers = options['promoteBuffers'] == null ? false : options['promoteBuffers'];
// Validate that we have at least 4 bytes of buffer
if(buffer.length < 5) throw new Error("corrupt bson message < 5 bytes long");
// Set up index
var index = typeof options['index'] == 'number' ? options['index'] : 0;
// Read the document size
var size = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
// Ensure buffer is valid size
if(size < 5 || size > buffer.length) throw new Error("corrupt bson message");
// Create holding object
var object = isArray ? [] : {};
// While we have more left data left keep parsing
while(true) {
// Read the type
var elementType = buffer[index++];
// If we get a zero it's the last byte, exit
if(elementType == 0) break;
var name = readCStyleStringSpecial(buffer, index);
index = index + name.length + 1;
// console.log("----------- 0 " + elementType + " :: " + name)
index = DeserializationMethods[elementType](name, object, buffer, index);
// console.log('--------------- 1')
}
// Check if we have a db ref object
if(object['$id'] != null) object = new DBRef(object['$ref'], object['$id'], object['$db']);
// Return the final objects
return object;
}
/**
* Ensure eval is isolated.
*
* @ignore
* @api private
*/
var isolateEvalWithHash = function(functionCache, hash, functionString, object) {
// Contains the value we are going to set
var value = null;
// Check for cache hit, eval if missing and return cached function
if(functionCache[hash] == null) {
eval("value = " + functionString);
functionCache[hash] = value;
}
// Set the object
return functionCache[hash].bind(object);
}
/**
* Ensure eval is isolated.
*
* @ignore
* @api private
*/
var isolateEval = function(functionString) {
// Contains the value we are going to set
var value = null;
// Eval the function
eval("value = " + functionString);
return value;
}
module.exports = deserialize

node/node_modules/bson/index.js generated vendored Normal file

@@ -0,0 +1,46 @@
var BSON = require('./lib/bson/bson'),
Binary = require('./lib/bson/binary'),
Code = require('./lib/bson/code'),
DBRef = require('./lib/bson/db_ref'),
Decimal128 = require('./lib/bson/decimal128'),
Double = require('./lib/bson/double'),
Int32 = require('./lib/bson/int_32'),
Long = require('./lib/bson/long'),
Map = require('./lib/bson/map'),
MaxKey = require('./lib/bson/max_key'),
MinKey = require('./lib/bson/min_key'),
ObjectId = require('./lib/bson/objectid'),
BSONRegExp = require('./lib/bson/regexp'),
Symbol = require('./lib/bson/symbol'),
Timestamp = require('./lib/bson/timestamp');
// BSON MAX VALUES
BSON.BSON_INT32_MAX = 0x7fffffff;
BSON.BSON_INT32_MIN = -0x80000000;
BSON.BSON_INT64_MAX = Math.pow(2, 63) - 1;
BSON.BSON_INT64_MIN = -Math.pow(2, 63);
// JS MAX PRECISE VALUES
BSON.JS_INT_MAX = 0x20000000000000; // Any integer up to 2^53 can be precisely represented by a double.
BSON.JS_INT_MIN = -0x20000000000000; // Any integer down to -2^53 can be precisely represented by a double.
// Add BSON types to function creation
BSON.Binary = Binary;
BSON.Code = Code;
BSON.DBRef = DBRef;
BSON.Decimal128 = Decimal128;
BSON.Double = Double;
BSON.Int32 = Int32;
BSON.Long = Long;
BSON.Map = Map;
BSON.MaxKey = MaxKey;
BSON.MinKey = MinKey;
BSON.ObjectId = ObjectId;
BSON.ObjectID = ObjectId;
BSON.BSONRegExp = BSONRegExp;
BSON.Symbol = Symbol;
BSON.Timestamp = Timestamp;
// Return the BSON
module.exports = BSON;
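A minimal usage sketch of the aggregated exports above, assuming the package is required as `bson` as in the README earlier in this diff:
```js
var BSON = require('bson');
var bson = new BSON();

var doc = { _id: new BSON.ObjectId(), count: BSON.Long.fromNumber(42) };
var bytes = bson.serialize(doc);

console.log(bson.deserialize(bytes)); // { _id: ..., count: 42 }
```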


@@ -5,7 +5,7 @@
// Test if we're in Node via presence of "global" not absence of "window"
// to support hybrid environments like Electron
if(typeof global !== 'undefined') {
if (typeof global !== 'undefined') {
var Buffer = require('buffer').Buffer; // TODO just use global Buffer
}
@@ -26,11 +26,21 @@ if(typeof global !== 'undefined') {
* @return {Binary}
*/
function Binary(buffer, subType) {
if(!(this instanceof Binary)) return new Binary(buffer, subType);
if (!(this instanceof Binary)) return new Binary(buffer, subType);
if (
buffer != null &&
!(typeof buffer === 'string') &&
!Buffer.isBuffer(buffer) &&
!(buffer instanceof Uint8Array) &&
!Array.isArray(buffer)
) {
throw new Error('only String, Buffer, Uint8Array or Array accepted');
}
this._bsontype = 'Binary';
if(buffer instanceof Number) {
if (buffer instanceof Number) {
this.sub_type = buffer;
this.position = 0;
} else {
@@ -38,25 +48,28 @@ function Binary(buffer, subType) {
this.position = 0;
}
if(buffer != null && !(buffer instanceof Number)) {
if (buffer != null && !(buffer instanceof Number)) {
// Only accept Buffer, Uint8Array or Arrays
if(typeof buffer == 'string') {
if (typeof buffer === 'string') {
// Different ways of writing the length of the string for the different types
if(typeof Buffer != 'undefined') {
if (typeof Buffer !== 'undefined') {
this.buffer = new Buffer(buffer);
} else if(typeof Uint8Array != 'undefined' || (Object.prototype.toString.call(buffer) == '[object Array]')) {
} else if (
typeof Uint8Array !== 'undefined' ||
Object.prototype.toString.call(buffer) === '[object Array]'
) {
this.buffer = writeStringToArray(buffer);
} else {
throw new Error("only String, Buffer, Uint8Array or Array accepted");
throw new Error('only String, Buffer, Uint8Array or Array accepted');
}
} else {
this.buffer = buffer;
}
this.position = buffer.length;
} else {
if(typeof Buffer != 'undefined') {
this.buffer = new Buffer(Binary.BUFFER_SIZE);
} else if(typeof Uint8Array != 'undefined'){
if (typeof Buffer !== 'undefined') {
this.buffer = new Buffer(Binary.BUFFER_SIZE);
} else if (typeof Uint8Array !== 'undefined') {
this.buffer = new Uint8Array(new ArrayBuffer(Binary.BUFFER_SIZE));
} else {
this.buffer = new Array(Binary.BUFFER_SIZE);
@@ -64,7 +77,7 @@ function Binary(buffer, subType) {
// Set position to start of buffer
this.position = 0;
}
};
}
/**
* Updates this binary with byte_value.
@@ -74,23 +87,25 @@
*/
Binary.prototype.put = function put(byte_value) {
// If it's a string and a has more than one character throw an error
if(byte_value['length'] != null && typeof byte_value != 'number' && byte_value.length != 1) throw new Error("only accepts single character String, Uint8Array or Array");
if(typeof byte_value != 'number' && byte_value < 0 || byte_value > 255) throw new Error("only accepts number in a valid unsigned byte range 0-255");
if (byte_value['length'] != null && typeof byte_value !== 'number' && byte_value.length !== 1)
throw new Error('only accepts single character String, Uint8Array or Array');
if ((typeof byte_value !== 'number' && byte_value < 0) || byte_value > 255)
throw new Error('only accepts number in a valid unsigned byte range 0-255');
// Decode the byte value once
var decoded_byte = null;
if(typeof byte_value == 'string') {
if (typeof byte_value === 'string') {
decoded_byte = byte_value.charCodeAt(0);
} else if(byte_value['length'] != null) {
} else if (byte_value['length'] != null) {
decoded_byte = byte_value[0];
} else {
decoded_byte = byte_value;
}
if(this.buffer.length > this.position) {
if (this.buffer.length > this.position) {
this.buffer[this.position++] = decoded_byte;
} else {
if(typeof Buffer != 'undefined' && Buffer.isBuffer(this.buffer)) {
if (typeof Buffer !== 'undefined' && Buffer.isBuffer(this.buffer)) {
// Create additional overflow buffer
var buffer = new Buffer(Binary.BUFFER_SIZE + this.buffer.length);
// Combine the two buffers together
@@ -98,16 +113,16 @@ Binary.prototype.put = function put(byte_value) {
this.buffer = buffer;
this.buffer[this.position++] = decoded_byte;
} else {
var buffer = null;
buffer = null;
// Create a new buffer (typed or normal array)
if(Object.prototype.toString.call(this.buffer) == '[object Uint8Array]') {
if (Object.prototype.toString.call(this.buffer) === '[object Uint8Array]') {
buffer = new Uint8Array(new ArrayBuffer(Binary.BUFFER_SIZE + this.buffer.length));
} else {
buffer = new Array(Binary.BUFFER_SIZE + this.buffer.length);
}
// We need to copy all the content to the new array
for(var i = 0; i < this.buffer.length; i++) {
for (var i = 0; i < this.buffer.length; i++) {
buffer[i] = this.buffer[i];
}
@@ -128,20 +143,20 @@ Binary.prototype.put = function put(byte_value) {
* @return {null}
*/
Binary.prototype.write = function write(string, offset) {
offset = typeof offset == 'number' ? offset : this.position;
offset = typeof offset === 'number' ? offset : this.position;
// If the buffer is to small let's extend the buffer
if(this.buffer.length < offset + string.length) {
if (this.buffer.length < offset + string.length) {
var buffer = null;
// If we are in node.js
if(typeof Buffer != 'undefined' && Buffer.isBuffer(this.buffer)) {
if (typeof Buffer !== 'undefined' && Buffer.isBuffer(this.buffer)) {
buffer = new Buffer(this.buffer.length + string.length);
this.buffer.copy(buffer, 0, 0, this.buffer.length);
} else if(Object.prototype.toString.call(this.buffer) == '[object Uint8Array]') {
} else if (Object.prototype.toString.call(this.buffer) === '[object Uint8Array]') {
// Create a new buffer
buffer = new Uint8Array(new ArrayBuffer(this.buffer.length + string.length))
buffer = new Uint8Array(new ArrayBuffer(this.buffer.length + string.length));
// Copy the content
for(var i = 0; i < this.position; i++) {
for (var i = 0; i < this.position; i++) {
buffer[i] = this.buffer[i];
}
}
@@ -150,23 +165,29 @@ Binary.prototype.write = function write(string, offset) {
this.buffer = buffer;
}
if(typeof Buffer != 'undefined' && Buffer.isBuffer(string) && Buffer.isBuffer(this.buffer)) {
if (typeof Buffer !== 'undefined' && Buffer.isBuffer(string) && Buffer.isBuffer(this.buffer)) {
string.copy(this.buffer, offset, 0, string.length);
this.position = (offset + string.length) > this.position ? (offset + string.length) : this.position;
this.position = offset + string.length > this.position ? offset + string.length : this.position;
// offset = string.length
} else if(typeof Buffer != 'undefined' && typeof string == 'string' && Buffer.isBuffer(this.buffer)) {
} else if (
typeof Buffer !== 'undefined' &&
typeof string === 'string' &&
Buffer.isBuffer(this.buffer)
) {
this.buffer.write(string, offset, 'binary');
this.position = (offset + string.length) > this.position ? (offset + string.length) : this.position;
this.position = offset + string.length > this.position ? offset + string.length : this.position;
// offset = string.length;
} else if(Object.prototype.toString.call(string) == '[object Uint8Array]'
|| Object.prototype.toString.call(string) == '[object Array]' && typeof string != 'string') {
for(var i = 0; i < string.length; i++) {
} else if (
Object.prototype.toString.call(string) === '[object Uint8Array]' ||
(Object.prototype.toString.call(string) === '[object Array]' && typeof string !== 'string')
) {
for (i = 0; i < string.length; i++) {
this.buffer[offset++] = string[i];
}
this.position = offset > this.position ? offset : this.position;
} else if(typeof string == 'string') {
for(var i = 0; i < string.length; i++) {
} else if (typeof string === 'string') {
for (i = 0; i < string.length; i++) {
this.buffer[offset++] = string.charCodeAt(i);
}
@@ -183,17 +204,18 @@ Binary.prototype.write = function write(string, offset) {
* @return {Buffer}
*/
Binary.prototype.read = function read(position, length) {
length = length && length > 0
? length
: this.position;
length = length && length > 0 ? length : this.position;
// Let's return the data based on the type we have
if(this.buffer['slice']) {
if (this.buffer['slice']) {
return this.buffer.slice(position, position + length);
} else {
// Create a buffer to keep the result
var buffer = typeof Uint8Array != 'undefined' ? new Uint8Array(new ArrayBuffer(length)) : new Array(length);
for(var i = 0; i < length; i++) {
var buffer =
typeof Uint8Array !== 'undefined'
? new Uint8Array(new ArrayBuffer(length))
: new Array(length);
for (var i = 0; i < length; i++) {
buffer[i] = this.buffer[position++];
}
}
@@ -211,22 +233,32 @@ Binary.prototype.value = function value(asRaw) {
asRaw = asRaw == null ? false : asRaw;
// Optimize to serialize for the situation where the data == size of buffer
if(asRaw && typeof Buffer != 'undefined' && Buffer.isBuffer(this.buffer) && this.buffer.length == this.position)
if (
asRaw &&
typeof Buffer !== 'undefined' &&
Buffer.isBuffer(this.buffer) &&
this.buffer.length === this.position
)
return this.buffer;
// If it's a node.js buffer object
if(typeof Buffer != 'undefined' && Buffer.isBuffer(this.buffer)) {
return asRaw ? this.buffer.slice(0, this.position) : this.buffer.toString('binary', 0, this.position);
if (typeof Buffer !== 'undefined' && Buffer.isBuffer(this.buffer)) {
return asRaw
? this.buffer.slice(0, this.position)
: this.buffer.toString('binary', 0, this.position);
} else {
if(asRaw) {
if (asRaw) {
// we support the slice command use it
if(this.buffer['slice'] != null) {
if (this.buffer['slice'] != null) {
return this.buffer.slice(0, this.position);
} else {
// Create a new buffer to copy content to
var newBuffer = Object.prototype.toString.call(this.buffer) == '[object Uint8Array]' ? new Uint8Array(new ArrayBuffer(this.position)) : new Array(this.position);
var newBuffer =
Object.prototype.toString.call(this.buffer) === '[object Uint8Array]'
? new Uint8Array(new ArrayBuffer(this.position))
: new Array(this.position);
// Copy content
for(var i = 0; i < this.position; i++) {
for (var i = 0; i < this.position; i++) {
newBuffer[i] = this.buffer[i];
}
// Return the buffer
@@ -253,14 +285,14 @@ Binary.prototype.length = function length() {
*/
Binary.prototype.toJSON = function() {
return this.buffer != null ? this.buffer.toString('base64') : '';
}
};
/**
* @ignore
*/
Binary.prototype.toString = function(format) {
return this.buffer != null ? this.buffer.slice(0, this.position).toString(format) : '';
}
};
/**
* Binary default subtype
@@ -273,14 +305,17 @@ var BSON_BINARY_SUBTYPE_DEFAULT = 0;
*/
var writeStringToArray = function(data) {
// Create a buffer
var buffer = typeof Uint8Array != 'undefined' ? new Uint8Array(new ArrayBuffer(data.length)) : new Array(data.length);
var buffer =
typeof Uint8Array !== 'undefined'
? new Uint8Array(new ArrayBuffer(data.length))
: new Array(data.length);
// Write the content to the buffer
for(var i = 0; i < data.length; i++) {
for (var i = 0; i < data.length; i++) {
buffer[i] = data.charCodeAt(i);
}
// Write the string to the buffer
return buffer;
}
};
/**
* Convert Array ot Uint8Array to Binary String
@@ -288,9 +323,9 @@ var writeStringToArray = function(data) {
* @ignore
*/
var convertArraytoUtf8BinaryString = function(byteArray, startIndex, endIndex) {
var result = "";
for(var i = startIndex; i < endIndex; i++) {
result = result + String.fromCharCode(byteArray[i]);
var result = '';
for (var i = startIndex; i < endIndex; i++) {
result = result + String.fromCharCode(byteArray[i]);
}
return result;
};
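A small usage sketch for the `Binary` class shown above, assuming it is obtained from the package entry point as `require('bson').Binary`:
```js
var Binary = require('bson').Binary;

var bin = new Binary(Buffer.from('hello'));
bin.put(0x21); // append a single byte ('!')

console.log(bin.length());               // 6
console.log(bin.value(true).toString()); // 'hello!'
```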


@@ -1,17 +1,15 @@
"use strict"
'use strict';
var writeIEEE754 = require('./float_parser').writeIEEE754,
readIEEE754 = require('./float_parser').readIEEE754,
Map = require('./map'),
Long = require('./long'),
var Map = require('./map'),
Long = require('./long'),
Double = require('./double'),
Timestamp = require('./timestamp'),
ObjectID = require('./objectid'),
BSONRegExp = require('./regexp'),
Symbol = require('./symbol'),
Int32 = require('./int_32'),
Int32 = require('./int_32'),
Code = require('./code'),
Decimal128 = require('./decimal128'),
Decimal128 = require('./decimal128'),
MinKey = require('./min_key'),
MaxKey = require('./max_key'),
DBRef = require('./db_ref'),
@@ -19,118 +17,178 @@ var writeIEEE754 = require('./float_parser').writeIEEE754,
// Parts of the parser
var deserialize = require('./parser/deserializer'),
serializer = require('./parser/serializer'),
calculateObjectSize = require('./parser/calculate_size');
serializer = require('./parser/serializer'),
calculateObjectSize = require('./parser/calculate_size');
/**
* @ignore
* @api private
*/
// Max Size
var MAXSIZE = (1024*1024*17);
// Max Document Buffer size
// Default Max Size
var MAXSIZE = 1024 * 1024 * 17;
// Current Internal Temporary Serialization Buffer
var buffer = new Buffer(MAXSIZE);
var BSON = function() {
}
var BSON = function() {};
/**
* Serialize a Javascript object.
*
* @param {Object} object the Javascript object to serialize.
* @param {Boolean} checkKeys the serializer will check if keys are valid.
* @param {Boolean} asBuffer return the serialized object as a Buffer object **(ignore)**.
* @param {Boolean} serializeFunctions serialize the javascript functions **(default:false)**.
* @param {Boolean} [options.checkKeys] the serializer will check if keys are valid.
* @param {Boolean} [options.serializeFunctions=false] serialize the javascript functions **(default:false)**.
* @param {Boolean} [options.ignoreUndefined=true] ignore undefined fields **(default:true)**.
* @param {Number} [options.minInternalBufferSize=1024*1024*17] minimum size of the internal temporary serialization buffer **(default:1024*1024*17)**.
* @return {Buffer} returns the Buffer object containing the serialized object.
* @api public
*/
BSON.prototype.serialize = function serialize(object, checkKeys, asBuffer, serializeFunctions, index, ignoreUndefined) {
// Attempt to serialize
var serializationIndex = serializer(buffer, object, checkKeys, index || 0, 0, serializeFunctions, ignoreUndefined, []);
// Create the final buffer
var finishedBuffer = new Buffer(serializationIndex);
// Copy into the finished buffer
buffer.copy(finishedBuffer, 0, 0, finishedBuffer.length);
// Return the buffer
return finishedBuffer;
}
BSON.prototype.serialize = function serialize(object, options) {
options = options || {};
// Unpack the options
var checkKeys = typeof options.checkKeys === 'boolean' ? options.checkKeys : false;
var serializeFunctions =
typeof options.serializeFunctions === 'boolean' ? options.serializeFunctions : false;
var ignoreUndefined =
typeof options.ignoreUndefined === 'boolean' ? options.ignoreUndefined : true;
var minInternalBufferSize =
typeof options.minInternalBufferSize === 'number' ? options.minInternalBufferSize : MAXSIZE;
// Resize the internal serialization buffer if needed
if (buffer.length < minInternalBufferSize) {
buffer = new Buffer(minInternalBufferSize);
}
// Attempt to serialize
var serializationIndex = serializer(
buffer,
object,
checkKeys,
0,
0,
serializeFunctions,
ignoreUndefined,
[]
);
// Create the final buffer
var finishedBuffer = new Buffer(serializationIndex);
// Copy into the finished buffer
buffer.copy(finishedBuffer, 0, 0, finishedBuffer.length);
// Return the buffer
return finishedBuffer;
};
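For context, a minimal usage sketch of this options-based `serialize` API, assuming the package's main entry point re-exports this `BSON` constructor (as in bson 1.x):

```
var BSON = require('bson');
var bson = new BSON();

// checkKeys makes the serializer reject keys that contain '.' or start with '$'.
var bytes = bson.serialize({ a: 1, b: 'two' }, { checkKeys: true });
console.log(bytes.length); // total size in bytes of the BSON document
```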
/**
* Serialize a Javascript object using a predefined Buffer and index into the buffer, useful when pre-allocating the space for serialization.
*
* @param {Object} object the Javascript object to serialize.
* @param {Boolean} checkKeys the serializer will check if keys are valid.
* @param {Buffer} buffer the Buffer you pre-allocated to store the serialized BSON object.
* @param {Number} index the index in the buffer where we wish to start serializing into.
* @param {Boolean} serializeFunctions serialize the javascript functions **(default:false)**.
* @param {Boolean} [options.checkKeys] the serializer will check if keys are valid.
* @param {Boolean} [options.serializeFunctions=false] serialize the javascript functions **(default:false)**.
* @param {Boolean} [options.ignoreUndefined=true] ignore undefined fields **(default:true)**.
* @param {Number} [options.index] the index in the buffer where we wish to start serializing into.
* @return {Number} returns the index pointing to the last written byte in the buffer.
* @api public
*/
BSON.prototype.serializeWithBufferAndIndex = function(object, checkKeys, finalBuffer, startIndex, serializeFunctions, ignoreUndefined) {
// Attempt to serialize
var serializationIndex = serializer(buffer, object, checkKeys, startIndex || 0, 0, serializeFunctions, ignoreUndefined);
buffer.copy(finalBuffer, startIndex, 0, serializationIndex);
// Return the index
return serializationIndex - 1;
}
BSON.prototype.serializeWithBufferAndIndex = function(object, finalBuffer, options) {
options = options || {};
// Unpack the options
var checkKeys = typeof options.checkKeys === 'boolean' ? options.checkKeys : false;
var serializeFunctions =
typeof options.serializeFunctions === 'boolean' ? options.serializeFunctions : false;
var ignoreUndefined =
typeof options.ignoreUndefined === 'boolean' ? options.ignoreUndefined : true;
var startIndex = typeof options.index === 'number' ? options.index : 0;
// Attempt to serialize
var serializationIndex = serializer(
finalBuffer,
object,
checkKeys,
startIndex || 0,
0,
serializeFunctions,
ignoreUndefined
);
// Return the index
return serializationIndex - 1;
};
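A sketch of serializing into a pre-allocated buffer with the new signature, where the target buffer is now the second positional argument and the start offset moves into `options.index` (the buffer size below is chosen arbitrarily):

```
var BSON = require('bson');
var bson = new BSON();

var target = new Buffer(256); // pre-allocated target, sized generously for this document
var lastByte = bson.serializeWithBufferAndIndex({ a: 1 }, target, { index: 0 });
// The document occupies target.slice(0, lastByte + 1), since the returned value
// is the index of the last written byte.
```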
/**
* Deserialize data as BSON.
*
* Options
* - **evalFunctions** {Boolean, default:false}, evaluate functions in the BSON document scoped to the object deserialized.
* - **cacheFunctions** {Boolean, default:false}, cache evaluated functions for reuse.
* - **cacheFunctionsCrc32** {Boolean, default:false}, use a crc32 code for caching, otherwise use the string of the function.
* - **promoteLongs** {Boolean, default:true}, when deserializing a Long will fit it into a Number if it's smaller than 53 bits
*
* @param {Buffer} buffer the buffer containing the serialized set of BSON documents.
* @param {Object} [options] additional options used for the deserialization.
* @param {Boolean} [isArray] ignore used for recursive parsing.
* @param {Object} [options.evalFunctions=false] evaluate functions in the BSON document scoped to the object deserialized.
* @param {Object} [options.cacheFunctions=false] cache evaluated functions for reuse.
* @param {Object} [options.cacheFunctionsCrc32=false] use a crc32 code for caching, otherwise use the string of the function.
* @param {Object} [options.promoteLongs=true] when deserializing, a Long will be converted to a Number if it fits in 53 bits.
* @param {Object} [options.promoteBuffers=false] when deserializing, a Binary will be returned as a node.js Buffer instance.
* @param {Object} [options.promoteValues=false] when deserializing, promote BSON values to their closest Node.js equivalent types.
* @param {Object} [options.fieldsAsRaw=null] allows specifying which fields should be returned as unserialized raw buffers.
* @param {Object} [options.bsonRegExp=false] return BSON regular expressions as BSONRegExp instances.
* @return {Object} returns the deserialized Javascript Object.
* @api public
*/
BSON.prototype.deserialize = function(data, options) {
return deserialize(data, options);
}
BSON.prototype.deserialize = function(buffer, options) {
return deserialize(buffer, options);
};
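A round-trip sketch showing how the deserialization options change what comes back (assuming `Long` is re-exported on the constructor, as elsewhere in this package):

```
var BSON = require('bson');
var Long = BSON.Long; // assumed re-export of ./long
var bson = new BSON();

var bytes = bson.serialize({ n: Long.fromNumber(42) });
var asNumber = bson.deserialize(bytes);                        // n is a plain Number (promoteLongs default)
var asLong = bson.deserialize(bytes, { promoteLongs: false }); // n stays a Long instance
```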
/**
* Calculate the bson size for a passed in Javascript object.
*
* @param {Object} object the Javascript object to calculate the BSON byte size for.
* @param {Boolean} [serializeFunctions] serialize all functions in the object **(default:false)**.
* @param {Boolean} [options.serializeFunctions=false] serialize the javascript functions **(default:false)**.
* @param {Boolean} [options.ignoreUndefined=true] ignore undefined fields **(default:true)**.
* @return {Number} returns the number of bytes the BSON object will take up.
* @api public
*/
BSON.prototype.calculateObjectSize = function(object, serializeFunctions, ignoreUndefined) {
BSON.prototype.calculateObjectSize = function(object, options) {
options = options || {};
var serializeFunctions =
typeof options.serializeFunctions === 'boolean' ? options.serializeFunctions : false;
var ignoreUndefined =
typeof options.ignoreUndefined === 'boolean' ? options.ignoreUndefined : true;
return calculateObjectSize(object, serializeFunctions, ignoreUndefined);
}
};
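For a plain document the computed size matches the length of the serialized buffer, which is the main use of this helper (pre-sizing buffers); a small sketch:

```
var BSON = require('bson');
var bson = new BSON();

var doc = { a: 1, tags: ['x', 'y'] };
var size = bson.calculateObjectSize(doc);
var bytes = bson.serialize(doc);
console.log(size === bytes.length); // expected: true
```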
/**
* Deserialize stream data as BSON documents.
*
* Options
* - **evalFunctions** {Boolean, default:false}, evaluate functions in the BSON document scoped to the object deserialized.
* - **cacheFunctions** {Boolean, default:false}, cache evaluated functions for reuse.
* - **cacheFunctionsCrc32** {Boolean, default:false}, use a crc32 code for caching, otherwise use the string of the function.
* - **promoteLongs** {Boolean, default:true}, when deserializing a Long will fit it into a Number if it's smaller than 53 bits
*
* @param {Buffer} data the buffer containing the serialized set of BSON documents.
* @param {Number} startIndex the start index in the data Buffer where the deserialization is to start.
* @param {Number} numberOfDocuments number of documents to deserialize.
* @param {Array} documents an array where to store the deserialized documents.
* @param {Number} docStartIndex the index in the documents array from where to start inserting documents.
* @param {Object} [options] additional options used for the deserialization.
* @param {Object} [options.evalFunctions=false] evaluate functions in the BSON document scoped to the object deserialized.
* @param {Object} [options.cacheFunctions=false] cache evaluated functions for reuse.
* @param {Object} [options.cacheFunctionsCrc32=false] use a crc32 code for caching, otherwise use the string of the function.
* @param {Object} [options.promoteLongs=true] when deserializing, a Long will be converted to a Number if it fits in 53 bits.
* @param {Object} [options.promoteBuffers=false] when deserializing, a Binary will be returned as a node.js Buffer instance.
* @param {Object} [options.promoteValues=false] when deserializing, promote BSON values to their closest Node.js equivalent types.
* @param {Object} [options.fieldsAsRaw=null] allows specifying which fields should be returned as unserialized raw buffers.
* @param {Object} [options.bsonRegExp=false] return BSON regular expressions as BSONRegExp instances.
* @return {Number} returns the next index in the buffer after deserialization **x** numbers of documents.
* @api public
*/
BSON.prototype.deserializeStream = function(data, startIndex, numberOfDocuments, documents, docStartIndex, options) {
// if(numberOfDocuments !== documents.length) throw new Error("Number of expected results back is less than the number of documents");
BSON.prototype.deserializeStream = function(
data,
startIndex,
numberOfDocuments,
documents,
docStartIndex,
options
) {
options = options != null ? options : {};
var index = startIndex;
// Loop over all documents
for(var i = 0; i < numberOfDocuments; i++) {
for (var i = 0; i < numberOfDocuments; i++) {
// Find size of the document
var size = data[index] | data[index + 1] << 8 | data[index + 2] << 16 | data[index + 3] << 24;
var size =
data[index] | (data[index + 1] << 8) | (data[index + 2] << 16) | (data[index + 3] << 24);
// Update options with index
options['index'] = index;
// Parse the document at this point
@ -141,26 +199,26 @@ BSON.prototype.deserializeStream = function(data, startIndex, numberOfDocuments,
// Return object containing end index of parsing and list of documents
return index;
}
};
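A sketch of reading several concatenated documents out of one buffer with `deserializeStream`:

```
var BSON = require('bson');
var bson = new BSON();

// Two documents serialized back to back in a single buffer.
var buf = Buffer.concat([bson.serialize({ i: 0 }), bson.serialize({ i: 1 })]);
var docs = new Array(2);
var endIndex = bson.deserializeStream(buf, 0, 2, docs, 0);
// docs is now [{ i: 0 }, { i: 1 }]; endIndex points just past the second document.
```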
/**
* @ignore
* @api private
*/
// BSON MAX VALUES
BSON.BSON_INT32_MAX = 0x7FFFFFFF;
BSON.BSON_INT32_MAX = 0x7fffffff;
BSON.BSON_INT32_MIN = -0x80000000;
BSON.BSON_INT64_MAX = Math.pow(2, 63) - 1;
BSON.BSON_INT64_MIN = -Math.pow(2, 63);
// JS MAX PRECISE VALUES
BSON.JS_INT_MAX = 0x20000000000000; // Any integer up to 2^53 can be precisely represented by a double.
BSON.JS_INT_MIN = -0x20000000000000; // Any integer down to -2^53 can be precisely represented by a double.
BSON.JS_INT_MAX = 0x20000000000000; // Any integer up to 2^53 can be precisely represented by a double.
BSON.JS_INT_MIN = -0x20000000000000; // Any integer down to -2^53 can be precisely represented by a double.
// Internal long versions
var JS_INT_MAX_LONG = Long.fromNumber(0x20000000000000); // Any integer up to 2^53 can be precisely represented by a double.
var JS_INT_MIN_LONG = Long.fromNumber(-0x20000000000000); // Any integer down to -2^53 can be precisely represented by a double.
// var JS_INT_MAX_LONG = Long.fromNumber(0x20000000000000); // Any integer up to 2^53 can be precisely represented by a double.
// var JS_INT_MIN_LONG = Long.fromNumber(-0x20000000000000); // Any integer down to -2^53 can be precisely represented by a double.
/**
* Number BSON Type

View file

@ -7,7 +7,7 @@
* @return {Code}
*/
var Code = function Code(code, scope) {
if(!(this instanceof Code)) return new Code(code, scope);
if (!(this instanceof Code)) return new Code(code, scope);
this._bsontype = 'Code';
this.code = code;
this.scope = scope;
@ -17,8 +17,8 @@ var Code = function Code(code, scope) {
* @ignore
*/
Code.prototype.toJSON = function() {
return {scope:this.scope, code:this.code};
}
return { scope: this.scope, code: this.code };
};
module.exports = Code;
module.exports.Code = Code;
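A small sketch of the `Code` wrapper, which pairs function source with an optional scope document (assuming `Code` is re-exported on the main constructor):

```
var BSON = require('bson');
var bson = new BSON();

var fn = new BSON.Code('function () { return x; }', { x: 1 });
var doc = bson.deserialize(bson.serialize({ fn: fn }));
// With the default options the value comes back as a Code instance,
// so doc.fn.scope.x === 1 and doc.fn.code holds the source string.
```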

View file

@ -8,13 +8,13 @@
* @return {DBRef}
*/
function DBRef(namespace, oid, db) {
if(!(this instanceof DBRef)) return new DBRef(namespace, oid, db);
if (!(this instanceof DBRef)) return new DBRef(namespace, oid, db);
this._bsontype = 'DBRef';
this.namespace = namespace;
this.oid = oid;
this.db = db;
};
}
/**
* @ignore
@ -22,11 +22,11 @@ function DBRef(namespace, oid, db) {
*/
DBRef.prototype.toJSON = function() {
return {
'$ref':this.namespace,
'$id':this.oid,
'$db':this.db == null ? '' : this.db
$ref: this.namespace,
$id: this.oid,
$db: this.db == null ? '' : this.db
};
}
};
module.exports = DBRef;
module.exports.DBRef = DBRef;
module.exports.DBRef = DBRef;

View file

@ -1,10 +1,10 @@
"use strict"
'use strict';
var Long = require('./long');
var PARSE_STRING_REGEXP = /^(\+|\-)?(\d+|(\d*\.\d*))?(E|e)?([\-\+])?(\d+)?$/;
var PARSE_INF_REGEXP = /^(\+|\-)?(Infinity|inf)$/i;
var PARSE_NAN_REGEXP = /^(\+|\-)?NaN$/i;
var PARSE_STRING_REGEXP = /^(\+|-)?(\d+|(\d*\.\d*))?(E|e)?([-+])?(\d+)?$/;
var PARSE_INF_REGEXP = /^(\+|-)?(Infinity|inf)$/i;
var PARSE_NAN_REGEXP = /^(\+|-)?NaN$/i;
var EXPONENT_MAX = 6111;
var EXPONENT_MIN = -6176;
@ -12,18 +12,68 @@ var EXPONENT_BIAS = 6176;
var MAX_DIGITS = 34;
// NaN value bits as 32 bit values (due to lack of longs)
var NAN_BUFFER = [0x7c, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00].reverse();
var NAN_BUFFER = [
0x7c,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00
].reverse();
// Infinity value bits as 32 bit values (due to lack of longs)
var INF_NEGATIVE_BUFFER = [0xf8, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00].reverse();
var INF_POSITIVE_BUFFER = [0x78, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00].reverse();
var EXPONENT_REGEX = /^([\-\+])?(\d+)?$/;
var INF_NEGATIVE_BUFFER = [
0xf8,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00
].reverse();
var INF_POSITIVE_BUFFER = [
0x78,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00,
0x00
].reverse();
var EXPONENT_REGEX = /^([-+])?(\d+)?$/;
// Detect if the value is a digit
var isDigit = function(value) {
return !isNaN(parseInt(value, 10));
}
};
// Divide two uint128 values
var divideu128 = function(value) {
@ -31,12 +81,11 @@ var divideu128 = function(value) {
var _rem = Long.fromNumber(0);
var i = 0;
if(!value.parts[0] && !value.parts[1] &&
!value.parts[2] && !value.parts[3]) {
if (!value.parts[0] && !value.parts[1] && !value.parts[2] && !value.parts[3]) {
return { quotient: value, rem: _rem };
}
for(var i = 0; i <= 3; i++) {
for (i = 0; i <= 3; i++) {
// Adjust remainder to match value of next dividend
_rem = _rem.shiftLeft(32);
// Add the divided to _rem
@ -46,12 +95,12 @@ var divideu128 = function(value) {
}
return { quotient: value, rem: _rem };
}
};
// Multiply two Long values and return the 128 bit value
var multiply64x2 = function(left, right) {
if(!left && !right) {
return {high: Long.fromNumber(0), low: Long.fromNumber(0)};
if (!left && !right) {
return { high: Long.fromNumber(0), low: Long.fromNumber(0) };
}
var leftHigh = left.shiftRightUnsigned(32);
@ -66,15 +115,15 @@ var multiply64x2 = function(left, right) {
productHigh = productHigh.add(productMid.shiftRightUnsigned(32));
productMid = new Long(productMid.getLowBits(), 0)
.add(productMid2)
.add(productLow.shiftRightUnsigned(32));
.add(productMid2)
.add(productLow.shiftRightUnsigned(32));
productHigh = productHigh.add(productMid.shiftRightUnsigned(32));
productLow = productMid.shiftLeft(32).add(new Long(productLow.getLowBits(), 0));
// Return the 128 bit result
return {high: productHigh, low: productLow};
}
return { high: productHigh, low: productLow };
};
var lessThan = function(left, right) {
// Make values unsigned
@ -82,51 +131,65 @@ var lessThan = function(left, right) {
var uhright = right.high_ >>> 0;
// Compare high bits first
if(uhleft < uhright) {
return true
} else if(uhleft == uhright) {
if (uhleft < uhright) {
return true;
} else if (uhleft === uhright) {
var ulleft = left.low_ >>> 0;
var ulright = right.low_ >>> 0;
if(ulleft < ulright) return true;
if (ulleft < ulright) return true;
}
return false;
}
};
var longtoHex = function(value) {
var buffer = new Buffer(8);
var index = 0;
// Encode the low 64 bits of the decimal
// Encode low bits
buffer[index++] = value.low_ & 0xff;
buffer[index++] = (value.low_ >> 8) & 0xff;
buffer[index++] = (value.low_ >> 16) & 0xff;
buffer[index++] = (value.low_ >> 24) & 0xff;
// Encode high bits
buffer[index++] = value.high_ & 0xff;
buffer[index++] = (value.high_ >> 8) & 0xff;
buffer[index++] = (value.high_ >> 16) & 0xff;
buffer[index++] = (value.high_ >> 24) & 0xff;
return buffer.reverse().toString('hex');
}
// var longtoHex = function(value) {
// var buffer = new Buffer(8);
// var index = 0;
// // Encode the low 64 bits of the decimal
// // Encode low bits
// buffer[index++] = value.low_ & 0xff;
// buffer[index++] = (value.low_ >> 8) & 0xff;
// buffer[index++] = (value.low_ >> 16) & 0xff;
// buffer[index++] = (value.low_ >> 24) & 0xff;
// // Encode high bits
// buffer[index++] = value.high_ & 0xff;
// buffer[index++] = (value.high_ >> 8) & 0xff;
// buffer[index++] = (value.high_ >> 16) & 0xff;
// buffer[index++] = (value.high_ >> 24) & 0xff;
// return buffer.reverse().toString('hex');
// };
var int32toHex = function(value) {
var buffer = new Buffer(4);
var index = 0;
// Encode the low 64 bits of the decimal
// Encode low bits
buffer[index++] = value & 0xff;
buffer[index++] = (value >> 8) & 0xff;
buffer[index++] = (value >> 16) & 0xff;
buffer[index++] = (value >> 24) & 0xff;
return buffer.reverse().toString('hex');
}
// var int32toHex = function(value) {
// var buffer = new Buffer(4);
// var index = 0;
// // Encode the low 64 bits of the decimal
// // Encode low bits
// buffer[index++] = value & 0xff;
// buffer[index++] = (value >> 8) & 0xff;
// buffer[index++] = (value >> 16) & 0xff;
// buffer[index++] = (value >> 24) & 0xff;
// return buffer.reverse().toString('hex');
// };
/**
* A class representation of the BSON Decimal128 type.
*
* @class
* @param {Buffer} bytes a buffer containing the raw Decimal128 bytes.
* @return {Double}
*/
var Decimal128 = function(bytes) {
this._bsontype = 'Decimal128';
this.bytes = bytes;
}
};
/**
* Create a Decimal128 instance from a string representation
*
* @method
* @param {string} string a numeric string representation.
* @return {Decimal128} returns a Decimal128 instance.
*/
Decimal128.fromString = function(string) {
// Parse state tracking
var isNegative = false;
@ -172,41 +235,46 @@ Decimal128.fromString = function(string) {
// Trim the string
string = string.trim();
// Naively guard against ReDoS attacks.
// TODO: implementing custom parsing for this, or refactoring the regex, would yield
// further gains.
if (string.length >= 7000) {
throw new Error('' + string + ' not a valid Decimal128 string');
}
// Results
var stringMatch = string.match(PARSE_STRING_REGEXP);
var infMatch = string.match(PARSE_INF_REGEXP);
var nanMatch = string.match(PARSE_NAN_REGEXP);
// Validate the string
if(!stringMatch
&& ! infMatch
&& ! nanMatch || string.length == 0) {
throw new Error("" + string + " not a valid Decimal128 string");
if ((!stringMatch && !infMatch && !nanMatch) || string.length === 0) {
throw new Error('' + string + ' not a valid Decimal128 string');
}
// Check if we have an illegal exponent format
if(stringMatch && stringMatch[4] && stringMatch[2] === undefined) {
throw new Error("" + string + " not a valid Decimal128 string");
if (stringMatch && stringMatch[4] && stringMatch[2] === undefined) {
throw new Error('' + string + ' not a valid Decimal128 string');
}
// Get the negative or positive sign
if(string[index] == '+' || string[index] == '-') {
isNegative = string[index++] == '-';
if (string[index] === '+' || string[index] === '-') {
isNegative = string[index++] === '-';
}
// Check if user passed Infinity or NaN
if(!isDigit(string[index]) && string[index] != '.') {
if(string[index] == 'i' || string[index] == 'I') {
if (!isDigit(string[index]) && string[index] !== '.') {
if (string[index] === 'i' || string[index] === 'I') {
return new Decimal128(new Buffer(isNegative ? INF_NEGATIVE_BUFFER : INF_POSITIVE_BUFFER));
} else if(string[index] == 'N') {
} else if (string[index] === 'N') {
return new Decimal128(new Buffer(NAN_BUFFER));
}
}
// Read all the digits
while(isDigit(string[index]) || string[index] == '.') {
if(string[index] == '.') {
if(sawRadix) {
while (isDigit(string[index]) || string[index] === '.') {
if (string[index] === '.') {
if (sawRadix) {
return new Decimal128(new Buffer(NAN_BUFFER));
}
@ -215,9 +283,9 @@ Decimal128.fromString = function(string) {
continue;
}
if(nDigitsStored < 34) {
if(string[index] != '0' || foundNonZero) {
if(!foundNonZero) {
if (nDigitsStored < 34) {
if (string[index] !== '0' || foundNonZero) {
if (!foundNonZero) {
firstNonZero = nDigitsRead;
}
@ -229,11 +297,11 @@ Decimal128.fromString = function(string) {
}
}
if(foundNonZero) {
if (foundNonZero) {
nDigits = nDigits + 1;
}
if(sawRadix) {
if (sawRadix) {
radixPosition = radixPosition + 1;
}
@ -241,17 +309,17 @@ Decimal128.fromString = function(string) {
index = index + 1;
}
if(sawRadix && !nDigitsRead) {
throw new Error("" + string + " not a valid Decimal128 string");
if (sawRadix && !nDigitsRead) {
throw new Error('' + string + ' not a valid Decimal128 string');
}
// Read exponent if exists
if(string[index] == 'e' || string[index] == 'E') {
if (string[index] === 'e' || string[index] === 'E') {
// Read exponent digits
var match = string.substr(++index).match(EXPONENT_REGEX);
// No digits read
if(!match || !match[2]) {
if (!match || !match[2]) {
return new Decimal128(new Buffer(NAN_BUFFER));
}
@ -263,7 +331,7 @@ Decimal128.fromString = function(string) {
}
// Return not a number
if(string[index]) {
if (string[index]) {
return new Decimal128(new Buffer(NAN_BUFFER));
}
@ -271,7 +339,7 @@ Decimal128.fromString = function(string) {
// Find first non-zero digit in digits
firstDigit = 0;
if(!nDigitsStored) {
if (!nDigitsStored) {
firstDigit = 0;
lastDigit = 0;
digits[0] = 0;
@ -282,8 +350,8 @@ Decimal128.fromString = function(string) {
lastDigit = nDigitsStored - 1;
significantDigits = nDigits;
if(exponent != 0 && significantDigits != 1) {
while(string[firstNonZero + significantDigits - 1] == '0') {
if (exponent !== 0 && significantDigits !== 1) {
while (string[firstNonZero + significantDigits - 1] === '0') {
significantDigits = significantDigits - 1;
}
}
@ -294,21 +362,21 @@ Decimal128.fromString = function(string) {
// to represent user input
// Overflow prevention
if(exponent <= radixPosition && radixPosition - exponent > (1 << 14)) {
if (exponent <= radixPosition && radixPosition - exponent > 1 << 14) {
exponent = EXPONENT_MIN;
} else {
exponent = exponent - radixPosition;
}
// Attempt to normalize the exponent
while(exponent > EXPONENT_MAX) {
while (exponent > EXPONENT_MAX) {
// Shift exponent to significand and decrease
lastDigit = lastDigit + 1;
if(lastDigit - firstDigit > MAX_DIGITS) {
if (lastDigit - firstDigit > MAX_DIGITS) {
// Check if we have a zero then just hard clamp, otherwise fail
var digitsString = digits.join('');
if(digitsString.match(/^0+$/)) {
if (digitsString.match(/^0+$/)) {
exponent = EXPONENT_MAX;
break;
} else {
@ -319,15 +387,15 @@ Decimal128.fromString = function(string) {
exponent = exponent - 1;
}
while(exponent < EXPONENT_MIN || nDigitsStored < nDigits) {
while (exponent < EXPONENT_MIN || nDigitsStored < nDigits) {
// Shift last digit
if(lastDigit == 0) {
if (lastDigit === 0) {
exponent = EXPONENT_MIN;
significantDigits = 0;
break;
}
if(nDigitsStored < nDigits) {
if (nDigitsStored < nDigits) {
// adjust to match digits not stored
nDigits = nDigits - 1;
} else {
@ -335,30 +403,29 @@ Decimal128.fromString = function(string) {
lastDigit = lastDigit - 1;
}
if(exponent < EXPONENT_MAX) {
if (exponent < EXPONENT_MAX) {
exponent = exponent + 1;
} else {
// Check if we have a zero then just hard clamp, otherwise fail
var digitsString = digits.join('');
if(digitsString.match(/^0+$/)) {
digitsString = digits.join('');
if (digitsString.match(/^0+$/)) {
exponent = EXPONENT_MAX;
break;
} else {
return new Decimal128(new Buffer(isNegative ? INF_NEGATIVE_BUFFER : INF_POSITIVE_BUFFER))
return new Decimal128(new Buffer(isNegative ? INF_NEGATIVE_BUFFER : INF_POSITIVE_BUFFER));
}
}
}
// Round
// We've normalized the exponent, but might still need to round.
if((lastDigit - firstDigit + 1 < significantDigits) && string[significantDigits] != '0') {
if (lastDigit - firstDigit + 1 < significantDigits && string[significantDigits] !== '0') {
var endOfString = nDigitsRead;
// If we have seen a radix point, 'string' is 1 longer than we have
// documented with ndigits_read, so inc the position of the first nonzero
// digit and the position that digits are read to.
if(sawRadix && exponent == EXPONENT_MIN) {
if (sawRadix && exponent === EXPONENT_MIN) {
firstNonZero = firstNonZero + 1;
endOfString = endOfString + 1;
}
@ -366,14 +433,14 @@ Decimal128.fromString = function(string) {
var roundDigit = parseInt(string[firstNonZero + lastDigit + 1], 10);
var roundBit = 0;
if(roundDigit >= 5) {
if (roundDigit >= 5) {
roundBit = 1;
if(roundDigit == 5) {
roundBit = digits[lastDigit] % 2 == 1;
if (roundDigit === 5) {
roundBit = digits[lastDigit] % 2 === 1;
for(var i = firstNonZero + lastDigit + 2; i < endOfString; i++) {
if(parseInt(string[i], 10)) {
for (i = firstNonZero + lastDigit + 2; i < endOfString; i++) {
if (parseInt(string[i], 10)) {
roundBit = 1;
break;
}
@ -381,20 +448,22 @@ Decimal128.fromString = function(string) {
}
}
if(roundBit) {
if (roundBit) {
var dIdx = lastDigit;
for(; dIdx >= 0; dIdx--) {
if(++digits[dIdx] > 9) {
for (; dIdx >= 0; dIdx--) {
if (++digits[dIdx] > 9) {
digits[dIdx] = 0;
// overflowed most significant digit
if(dIdx == 0) {
if(exponent < EXPONENT_MAX) {
if (dIdx === 0) {
if (exponent < EXPONENT_MAX) {
exponent = exponent + 1;
digits[dIdx] = 1;
} else {
return new Decimal128(new Buffer(isNegative ? INF_NEGATIVE_BUFFER : INF_POSITIVE_BUFFER))
return new Decimal128(
new Buffer(isNegative ? INF_NEGATIVE_BUFFER : INF_POSITIVE_BUFFER)
);
}
}
} else {
@ -411,52 +480,59 @@ Decimal128.fromString = function(string) {
significandLow = Long.fromNumber(0);
// read a zero
if(significantDigits == 0) {
if (significantDigits === 0) {
significandHigh = Long.fromNumber(0);
significandLow = Long.fromNumber(0);
} else if(lastDigit - firstDigit < 17) {
var dIdx = firstDigit;
} else if (lastDigit - firstDigit < 17) {
dIdx = firstDigit;
significandLow = Long.fromNumber(digits[dIdx++]);
significandHigh = new Long(0, 0);
for(; dIdx <= lastDigit; dIdx++) {
for (; dIdx <= lastDigit; dIdx++) {
significandLow = significandLow.multiply(Long.fromNumber(10));
significandLow = significandLow.add(Long.fromNumber(digits[dIdx]));
}
} else {
var dIdx = firstDigit;
dIdx = firstDigit;
significandHigh = Long.fromNumber(digits[dIdx++]);
for(; dIdx <= lastDigit - 17; dIdx++) {
for (; dIdx <= lastDigit - 17; dIdx++) {
significandHigh = significandHigh.multiply(Long.fromNumber(10));
significandHigh = significandHigh.add(Long.fromNumber(digits[dIdx]));
}
significandLow = Long.fromNumber(digits[dIdx++]);
for(; dIdx <= lastDigit; dIdx++) {
for (; dIdx <= lastDigit; dIdx++) {
significandLow = significandLow.multiply(Long.fromNumber(10));
significandLow = significandLow.add(Long.fromNumber(digits[dIdx]));
}
}
var significand = multiply64x2(significandHigh, Long.fromString("100000000000000000"));
var significand = multiply64x2(significandHigh, Long.fromString('100000000000000000'));
significand.low = significand.low.add(significandLow);
if(lessThan(significand.low, significandLow)) {
if (lessThan(significand.low, significandLow)) {
significand.high = significand.high.add(Long.fromNumber(1));
}
// Biased exponent
var biasedExponent = (exponent + EXPONENT_BIAS);
biasedExponent = exponent + EXPONENT_BIAS;
var dec = { low: Long.fromNumber(0), high: Long.fromNumber(0) };
// Encode combination, exponent, and significand.
if(significand.high.shiftRightUnsigned(49).and(Long.fromNumber(1)).equals(Long.fromNumber)) {
if (
significand.high
.shiftRightUnsigned(49)
.and(Long.fromNumber(1))
.equals(Long.fromNumber)
) {
// Encode '11' into bits 1 to 3
dec.high = dec.high.or(Long.fromNumber(0x3).shiftLeft(61));
dec.high = dec.high.or(Long.fromNumber(biasedExponent).and(Long.fromNumber(0x3fff).shiftLeft(47)));
dec.high = dec.high.or(
Long.fromNumber(biasedExponent).and(Long.fromNumber(0x3fff).shiftLeft(47))
);
dec.high = dec.high.or(significand.high.and(Long.fromNumber(0x7fffffffffff)));
} else {
dec.high = dec.high.or(Long.fromNumber(biasedExponent & 0x3fff).shiftLeft(49));
@ -466,13 +542,13 @@ Decimal128.fromString = function(string) {
dec.low = significand.low;
// Encode sign
if(isNegative) {
if (isNegative) {
dec.high = dec.high.or(Long.fromString('9223372036854775808'));
}
// Encode into a buffer
var buffer = new Buffer(16);
var index = 0;
index = 0;
// Encode the low 64 bits of the decimal
// Encode low bits
@ -500,7 +576,7 @@ Decimal128.fromString = function(string) {
// Return the new Decimal128
return new Decimal128(buffer);
}
};
// Extract least significant 5 bits
var COMBINATION_MASK = 0x1f;
@ -511,10 +587,16 @@ var COMBINATION_INFINITY = 30;
// Value of combination field for NaN
var COMBINATION_NAN = 31;
// Value of combination field for NaN
var COMBINATION_SNAN = 32;
// var COMBINATION_SNAN = 32;
// decimal128 exponent bias
var EXPONENT_BIAS = 6176;
EXPONENT_BIAS = 6176;
/**
* Create a string representation of the raw Decimal128 value
*
* @method
* @return {string} returns a Decimal128 string representation.
*/
Decimal128.prototype.toString = function() {
// Note: bits in this routine are referred to starting at 0,
// from the sign bit, towards the coefficient.
@ -535,7 +617,7 @@ Decimal128.prototype.toString = function() {
var significand_digits = 0;
// the base-10 digits in the significand
var significand = new Array(36);
for(var i = 0; i < significand.length; i++) significand[i] = 0;
for (var i = 0; i < significand.length; i++) significand[i] = 0;
// read pointer into significand
var index = 0;
@ -550,49 +632,54 @@ Decimal128.prototype.toString = function() {
// the most significant significand bits (50-46)
var significand_msb;
// temporary storage for significand decoding
var significand128 = {parts: new Array(4)};
var significand128 = { parts: new Array(4) };
// indexing variables
var i;
i;
var j, k;
// Output string
var string = [];
// Unpack index
var index = 0;
index = 0;
// Buffer reference
var buffer = this.bytes;
// Unpack the low 64bits into a long
low = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
midl = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
low =
buffer[index++] | (buffer[index++] << 8) | (buffer[index++] << 16) | (buffer[index++] << 24);
midl =
buffer[index++] | (buffer[index++] << 8) | (buffer[index++] << 16) | (buffer[index++] << 24);
// Unpack the high 64bits into a long
midh = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
high = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
midh =
buffer[index++] | (buffer[index++] << 8) | (buffer[index++] << 16) | (buffer[index++] << 24);
high =
buffer[index++] | (buffer[index++] << 8) | (buffer[index++] << 16) | (buffer[index++] << 24);
// Unpack index
var index = 0;
index = 0;
// Create the state of the decimal
var dec = {
low: new Long(low, midl),
high: new Long(midh, high) };
high: new Long(midh, high)
};
if(dec.high.lessThan(Long.ZERO)) {
if (dec.high.lessThan(Long.ZERO)) {
string.push('-');
}
// Decode combination field and exponent
combination = (high >> 26) & COMBINATION_MASK;
if((combination >> 3) == 3) {
if (combination >> 3 === 3) {
// Check for 'special' values
if(combination == COMBINATION_INFINITY) {
return string.join('') + "Infinity";
} else if(combination == COMBINATION_NAN) {
return "NaN";
if (combination === COMBINATION_INFINITY) {
return string.join('') + 'Infinity';
} else if (combination === COMBINATION_NAN) {
return 'NaN';
} else {
biased_exponent = (high >> 15) & EXPONENT_MASK;
significand_msb = 0x08 + ((high >> 14) & 0x01);
@ -614,11 +701,15 @@ Decimal128.prototype.toString = function() {
significand128.parts[2] = midl;
significand128.parts[3] = low;
if(significand128.parts[0] == 0 && significand128.parts[1] == 0
&& significand128.parts[2] == 0 && significand128.parts[3] == 0) {
is_zero = true;
if (
significand128.parts[0] === 0 &&
significand128.parts[1] === 0 &&
significand128.parts[2] === 0 &&
significand128.parts[3] === 0
) {
is_zero = true;
} else {
for(var k = 3; k >= 0; k--) {
for (k = 3; k >= 0; k--) {
var least_digits = 0;
// Perform the divide
var result = divideu128(significand128);
@ -627,9 +718,9 @@ Decimal128.prototype.toString = function() {
// We now have the 9 least significant digits (in base 2).
// Convert and output to string.
if(!least_digits) continue;
if (!least_digits) continue;
for(var j = 8; j >= 0; j--) {
for (j = 8; j >= 0; j--) {
// significand[k * 9 + j] = Math.round(least_digits % 10);
significand[k * 9 + j] = least_digits % 10;
// least_digits = Math.round(least_digits / 10);
@ -642,14 +733,14 @@ Decimal128.prototype.toString = function() {
// Scientific - [-]d.dddE(+/-)dd or [-]dE(+/-)dd
// Regular - ddd.ddd
if(is_zero) {
if (is_zero) {
significand_digits = 1;
significand[index] = 0;
} else {
significand_digits = 36;
var i = 0;
i = 0;
while(!significand[index]) {
while (!significand[index]) {
i++;
significand_digits = significand_digits - 1;
index = index + 1;
@ -666,39 +757,38 @@ Decimal128.prototype.toString = function() {
// because doing so would change the precision of the value, and would
// change stored data if the string converted number is round tripped.
if(scientific_exponent >= 34 || scientific_exponent <= -7 ||
exponent > 0) {
if (scientific_exponent >= 34 || scientific_exponent <= -7 || exponent > 0) {
// Scientific format
string.push(significand[index++]);
significand_digits = significand_digits - 1;
if(significand_digits) {
if (significand_digits) {
string.push('.');
}
for(var i = 0; i < significand_digits; i++) {
for (i = 0; i < significand_digits; i++) {
string.push(significand[index++]);
}
// Exponent
string.push('E');
if(scientific_exponent > 0) {
if (scientific_exponent > 0) {
string.push('+' + scientific_exponent);
} else {
string.push(scientific_exponent);
}
} else {
// Regular format with no decimal place
if(exponent >= 0) {
for(var i = 0; i < significand_digits; i++) {
if (exponent >= 0) {
for (i = 0; i < significand_digits; i++) {
string.push(significand[index++]);
}
} else {
var radix_position = significand_digits + exponent;
// non-zero digits before radix
if(radix_position > 0) {
for(var i = 0; i < radix_position; i++) {
if (radix_position > 0) {
for (i = 0; i < radix_position; i++) {
string.push(significand[index++]);
}
} else {
@ -707,22 +797,22 @@ Decimal128.prototype.toString = function() {
string.push('.');
// add leading zeros after radix
while(radix_position++ < 0) {
while (radix_position++ < 0) {
string.push('0');
}
for(var i = 0; i < significand_digits - Math.max(radix_position - 1, 0); i++) {
for (i = 0; i < significand_digits - Math.max(radix_position - 1, 0); i++) {
string.push(significand[index++]);
}
}
}
return string.join('');
}
};
Decimal128.prototype.toJSON = function() {
return { "$numberDecimal": this.toString() };
}
return { $numberDecimal: this.toString() };
};
module.exports = Decimal128;
module.exports.Decimal128 = Decimal128;
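A usage sketch for `Decimal128`, whose string round trip avoids binary floating-point rounding (assuming the type is re-exported on the main constructor):

```
var Decimal128 = require('bson').Decimal128; // assumed re-export of ./decimal128

var d = Decimal128.fromString('0.1');
console.log(d.toString());      // '0.1' — exact decimal round trip
console.log(JSON.stringify(d)); // '{"$numberDecimal":"0.1"}' via toJSON()
```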

View file

@ -6,7 +6,7 @@
* @return {Double}
*/
function Double(value) {
if(!(this instanceof Double)) return new Double(value);
if (!(this instanceof Double)) return new Double(value);
this._bsontype = 'Double';
this.value = value;
@ -27,7 +27,7 @@ Double.prototype.valueOf = function() {
*/
Double.prototype.toJSON = function() {
return this.value;
}
};
module.exports = Double;
module.exports.Double = Double;

View file

@ -1,20 +1,20 @@
// Copyright (c) 2008, Fair Oaks Labs, Inc.
// All rights reserved.
//
//
// Redistribution and use in source and binary forms, with or without
// modification, are permitted provided that the following conditions are met:
//
//
// * Redistributions of source code must retain the above copyright notice,
// this list of conditions and the following disclaimer.
//
//
// * Redistributions in binary form must reproduce the above copyright notice,
// this list of conditions and the following disclaimer in the documentation
// and/or other materials provided with the distribution.
//
//
// * Neither the name of Fair Oaks Labs, Inc. nor the names of its contributors
// may be used to endorse or promote products derived from this software
// without specific prior written permission.
//
//
// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
// AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
// IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
@ -31,32 +31,33 @@
// Modifications to writeIEEE754 to support negative zeroes made by Brian White
var readIEEE754 = function(buffer, offset, endian, mLen, nBytes) {
var e, m,
bBE = (endian === 'big'),
eLen = nBytes * 8 - mLen - 1,
eMax = (1 << eLen) - 1,
eBias = eMax >> 1,
nBits = -7,
i = bBE ? 0 : (nBytes - 1),
d = bBE ? 1 : -1,
s = buffer[offset + i];
var e,
m,
bBE = endian === 'big',
eLen = nBytes * 8 - mLen - 1,
eMax = (1 << eLen) - 1,
eBias = eMax >> 1,
nBits = -7,
i = bBE ? 0 : nBytes - 1,
d = bBE ? 1 : -1,
s = buffer[offset + i];
i += d;
e = s & ((1 << (-nBits)) - 1);
s >>= (-nBits);
e = s & ((1 << -nBits) - 1);
s >>= -nBits;
nBits += eLen;
for (; nBits > 0; e = e * 256 + buffer[offset + i], i += d, nBits -= 8);
m = e & ((1 << (-nBits)) - 1);
e >>= (-nBits);
m = e & ((1 << -nBits) - 1);
e >>= -nBits;
nBits += mLen;
for (; nBits > 0; m = m * 256 + buffer[offset + i], i += d, nBits -= 8);
if (e === 0) {
e = 1 - eBias;
} else if (e === eMax) {
return m ? NaN : ((s ? -1 : 1) * Infinity);
return m ? NaN : (s ? -1 : 1) * Infinity;
} else {
m = m + Math.pow(2, mLen);
e = e - eBias;
@ -65,15 +66,17 @@ var readIEEE754 = function(buffer, offset, endian, mLen, nBytes) {
};
var writeIEEE754 = function(buffer, value, offset, endian, mLen, nBytes) {
var e, m, c,
bBE = (endian === 'big'),
eLen = nBytes * 8 - mLen - 1,
eMax = (1 << eLen) - 1,
eBias = eMax >> 1,
rt = (mLen === 23 ? Math.pow(2, -24) - Math.pow(2, -77) : 0),
i = bBE ? (nBytes-1) : 0,
d = bBE ? -1 : 1,
s = value < 0 || (value === 0 && 1 / value < 0) ? 1 : 0;
var e,
m,
c,
bBE = endian === 'big',
eLen = nBytes * 8 - mLen - 1,
eMax = (1 << eLen) - 1,
eBias = eMax >> 1,
rt = mLen === 23 ? Math.pow(2, -24) - Math.pow(2, -77) : 0,
i = bBE ? nBytes - 1 : 0,
d = bBE ? -1 : 1,
s = value < 0 || (value === 0 && 1 / value < 0) ? 1 : 0;
value = Math.abs(value);
@ -86,7 +89,7 @@ var writeIEEE754 = function(buffer, value, offset, endian, mLen, nBytes) {
e--;
c *= 2;
}
if (e+eBias >= 1) {
if (e + eBias >= 1) {
value += rt / c;
} else {
value += rt * Math.pow(2, 1 - eBias);
@ -118,4 +121,4 @@ var writeIEEE754 = function(buffer, value, offset, endian, mLen, nBytes) {
};
exports.readIEEE754 = readIEEE754;
exports.writeIEEE754 = writeIEEE754;
exports.writeIEEE754 = writeIEEE754;
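A round-trip sketch for the IEEE 754 helpers; the (52, 8) pair is the mantissa-bits/byte-count combination used for doubles. The relative require path assumes a hypothetical caller sitting next to this file:

```
var parser = require('./float_parser'); // assumed location inside lib/bson

var buf = new Buffer(8);
parser.writeIEEE754(buf, 3.14159, 0, 'little', 52, 8);    // write a little-endian double
console.log(parser.readIEEE754(buf, 0, 'little', 52, 8)); // 3.14159
```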

View file

@ -1,90 +0,0 @@
try {
exports.BSONPure = require('./bson');
exports.BSONNative = require('./bson');
} catch(err) {
}
[ 'binary'
, 'code'
, 'map'
, 'db_ref'
, 'double'
, 'int_32'
, 'max_key'
, 'min_key'
, 'objectid'
, 'regexp'
, 'symbol'
, 'decimal128'
, 'timestamp'
, 'long'
, 'bson'].forEach(function (path) {
var module = require('./' + path);
for (var i in module) {
exports[i] = module[i];
}
});
// Exports all the classes for the PURE JS BSON Parser
exports.pure = function() {
var classes = {};
// Map all the classes
[ 'binary'
, 'code'
, 'map'
, 'db_ref'
, 'double'
, 'int_32'
, 'max_key'
, 'min_key'
, 'objectid'
, 'regexp'
, 'symbol'
, 'decimal128'
, 'timestamp'
, 'long'
, 'bson'].forEach(function (path) {
var module = require('./' + path);
for (var i in module) {
classes[i] = module[i];
}
});
// Return classes list
return classes;
}
// Exports all the classes for the NATIVE JS BSON Parser
exports.native = function() {
var classes = {};
// Map all the classes
[ 'binary'
, 'code'
, 'map'
, 'db_ref'
, 'double'
, 'int_32'
, 'max_key'
, 'min_key'
, 'objectid'
, 'regexp'
, 'symbol'
, 'decimal128'
, 'timestamp'
, 'long'
, 'bson'].forEach(function (path) {
var module = require('./' + path);
for (var i in module) {
classes[i] = module[i];
}
});
// Catch error and return no classes found
try {
classes['BSON'] = require('./bson');
} catch(err) {
return exports.pure();
}
// Return classes list
return classes;
}

View file

@ -1,9 +1,16 @@
/**
* A class representation of a BSON Int32 type.
*
* @class
* @param {number} value the number we want to represent as an int32.
* @return {Int32}
*/
var Int32 = function(value) {
if(!(this instanceof Int32)) return new Int32(value);
if (!(this instanceof Int32)) return new Int32(value);
this._bsontype = 'Int32';
this.value = value;
}
};
/**
* Access the number value.
@ -20,7 +27,7 @@ Int32.prototype.valueOf = function() {
*/
Int32.prototype.toJSON = function() {
return this.value;
}
};
module.exports = Int32;
module.exports.Int32 = Int32;

View file

@ -41,21 +41,21 @@
* @return {Long}
*/
function Long(low, high) {
if(!(this instanceof Long)) return new Long(low, high);
if (!(this instanceof Long)) return new Long(low, high);
this._bsontype = 'Long';
/**
* @type {number}
* @ignore
*/
this.low_ = low | 0; // force into 32 signed bits.
this.low_ = low | 0; // force into 32 signed bits.
/**
* @type {number}
* @ignore
*/
this.high_ = high | 0; // force into 32 signed bits.
};
this.high_ = high | 0; // force into 32 signed bits.
}
/**
* Return the int value.
@ -74,8 +74,7 @@ Long.prototype.toInt = function() {
* @return {number} the closest floating-point representation to this value.
*/
Long.prototype.toNumber = function() {
return this.high_ * Long.TWO_PWR_32_DBL_ +
this.getLowBitsUnsigned();
return this.high_ * Long.TWO_PWR_32_DBL_ + this.getLowBitsUnsigned();
};
/**
@ -86,7 +85,7 @@ Long.prototype.toNumber = function() {
*/
Long.prototype.toJSON = function() {
return this.toString();
}
};
/**
* Return the String value.
@ -122,9 +121,10 @@ Long.prototype.toString = function(opt_radix) {
// minimize the calls to the very expensive emulated div.
var radixToPower = Long.fromNumber(Math.pow(radix, 6));
var rem = this;
rem = this;
var result = '';
while (true) {
while (!rem.isZero()) {
var remDiv = rem.div(radixToPower);
var intval = rem.subtract(remDiv.multiply(radixToPower)).toInt();
var digits = intval.toString(radix);
@ -168,8 +168,7 @@ Long.prototype.getLowBits = function() {
* @return {number} the low 32-bits as an unsigned value.
*/
Long.prototype.getLowBitsUnsigned = function() {
return (this.low_ >= 0) ?
this.low_ : Long.TWO_PWR_32_DBL_ + this.low_;
return this.low_ >= 0 ? this.low_ : Long.TWO_PWR_32_DBL_ + this.low_;
};
/**
@ -186,13 +185,13 @@ Long.prototype.getNumBitsAbs = function() {
return this.negate().getNumBitsAbs();
}
} else {
var val = this.high_ != 0 ? this.high_ : this.low_;
var val = this.high_ !== 0 ? this.high_ : this.low_;
for (var bit = 31; bit > 0; bit--) {
if ((val & (1 << bit)) != 0) {
if ((val & (1 << bit)) !== 0) {
break;
}
}
return this.high_ != 0 ? bit + 33 : bit + 1;
return this.high_ !== 0 ? bit + 33 : bit + 1;
}
};
@ -203,7 +202,7 @@ Long.prototype.getNumBitsAbs = function() {
* @return {boolean} whether this value is zero.
*/
Long.prototype.isZero = function() {
return this.high_ == 0 && this.low_ == 0;
return this.high_ === 0 && this.low_ === 0;
};
/**
@ -223,7 +222,7 @@ Long.prototype.isNegative = function() {
* @return {boolean} whether this value is odd.
*/
Long.prototype.isOdd = function() {
return (this.low_ & 1) == 1;
return (this.low_ & 1) === 1;
};
/**
@ -234,7 +233,7 @@ Long.prototype.isOdd = function() {
* @return {boolean} whether this Long equals the other
*/
Long.prototype.equals = function(other) {
return (this.high_ == other.high_) && (this.low_ == other.low_);
return this.high_ === other.high_ && this.low_ === other.low_;
};
/**
@ -245,7 +244,7 @@ Long.prototype.equals = function(other) {
* @return {boolean} whether this Long does not equal the other.
*/
Long.prototype.notEquals = function(other) {
return (this.high_ != other.high_) || (this.low_ != other.low_);
return this.high_ !== other.high_ || this.low_ !== other.low_;
};
/**
@ -346,27 +345,30 @@ Long.prototype.add = function(other) {
// Divide each number into 4 chunks of 16 bits, and then sum the chunks.
var a48 = this.high_ >>> 16;
var a32 = this.high_ & 0xFFFF;
var a32 = this.high_ & 0xffff;
var a16 = this.low_ >>> 16;
var a00 = this.low_ & 0xFFFF;
var a00 = this.low_ & 0xffff;
var b48 = other.high_ >>> 16;
var b32 = other.high_ & 0xFFFF;
var b32 = other.high_ & 0xffff;
var b16 = other.low_ >>> 16;
var b00 = other.low_ & 0xFFFF;
var b00 = other.low_ & 0xffff;
var c48 = 0, c32 = 0, c16 = 0, c00 = 0;
var c48 = 0,
c32 = 0,
c16 = 0,
c00 = 0;
c00 += a00 + b00;
c16 += c00 >>> 16;
c00 &= 0xFFFF;
c00 &= 0xffff;
c16 += a16 + b16;
c32 += c16 >>> 16;
c16 &= 0xFFFF;
c16 &= 0xffff;
c32 += a32 + b32;
c48 += c32 >>> 16;
c32 &= 0xFFFF;
c32 &= 0xffff;
c48 += a48 + b48;
c48 &= 0xFFFF;
c48 &= 0xffff;
return Long.fromBits((c16 << 16) | c00, (c48 << 16) | c32);
};
@ -405,15 +407,16 @@ Long.prototype.multiply = function(other) {
if (other.isNegative()) {
return this.negate().multiply(other.negate());
} else {
return this.negate().multiply(other).negate();
return this.negate()
.multiply(other)
.negate();
}
} else if (other.isNegative()) {
return this.multiply(other.negate()).negate();
}
// If both Longs are small, use float multiplication
if (this.lessThan(Long.TWO_PWR_24_) &&
other.lessThan(Long.TWO_PWR_24_)) {
if (this.lessThan(Long.TWO_PWR_24_) && other.lessThan(Long.TWO_PWR_24_)) {
return Long.fromNumber(this.toNumber() * other.toNumber());
}
@ -421,36 +424,39 @@ Long.prototype.multiply = function(other) {
// We can skip products that would overflow.
var a48 = this.high_ >>> 16;
var a32 = this.high_ & 0xFFFF;
var a32 = this.high_ & 0xffff;
var a16 = this.low_ >>> 16;
var a00 = this.low_ & 0xFFFF;
var a00 = this.low_ & 0xffff;
var b48 = other.high_ >>> 16;
var b32 = other.high_ & 0xFFFF;
var b32 = other.high_ & 0xffff;
var b16 = other.low_ >>> 16;
var b00 = other.low_ & 0xFFFF;
var b00 = other.low_ & 0xffff;
var c48 = 0, c32 = 0, c16 = 0, c00 = 0;
var c48 = 0,
c32 = 0,
c16 = 0,
c00 = 0;
c00 += a00 * b00;
c16 += c00 >>> 16;
c00 &= 0xFFFF;
c00 &= 0xffff;
c16 += a16 * b00;
c32 += c16 >>> 16;
c16 &= 0xFFFF;
c16 &= 0xffff;
c16 += a00 * b16;
c32 += c16 >>> 16;
c16 &= 0xFFFF;
c16 &= 0xffff;
c32 += a32 * b00;
c48 += c32 >>> 16;
c32 &= 0xFFFF;
c32 &= 0xffff;
c32 += a16 * b16;
c48 += c32 >>> 16;
c32 &= 0xFFFF;
c32 &= 0xffff;
c32 += a00 * b32;
c48 += c32 >>> 16;
c32 &= 0xFFFF;
c32 &= 0xffff;
c48 += a48 * b00 + a32 * b16 + a16 * b32 + a00 * b48;
c48 &= 0xFFFF;
c48 &= 0xffff;
return Long.fromBits((c16 << 16) | c00, (c48 << 16) | c32);
};
@ -469,9 +475,8 @@ Long.prototype.div = function(other) {
}
if (this.equals(Long.MIN_VALUE)) {
if (other.equals(Long.ONE) ||
other.equals(Long.NEG_ONE)) {
return Long.MIN_VALUE; // recall that -MIN_VALUE == MIN_VALUE
if (other.equals(Long.ONE) || other.equals(Long.NEG_ONE)) {
return Long.MIN_VALUE; // recall that -MIN_VALUE == MIN_VALUE
} else if (other.equals(Long.MIN_VALUE)) {
return Long.ONE;
} else {
@ -494,7 +499,9 @@ Long.prototype.div = function(other) {
if (other.isNegative()) {
return this.negate().div(other.negate());
} else {
return this.negate().div(other).negate();
return this.negate()
.div(other)
.negate();
}
} else if (other.isNegative()) {
return this.div(other.negate()).negate();
@ -506,16 +513,16 @@ Long.prototype.div = function(other) {
// the approximate value is less than or equal to the real value so that the
// remainder never becomes negative.
var res = Long.ZERO;
var rem = this;
rem = this;
while (rem.greaterThanOrEqual(other)) {
// Approximate the result of division. This may be a little greater or
// smaller than the actual value.
var approx = Math.max(1, Math.floor(rem.toNumber() / other.toNumber()));
approx = Math.max(1, Math.floor(rem.toNumber() / other.toNumber()));
// We will tweak the approximate result by changing it in the 48-th digit or
// the smallest non-fractional digit, whichever is larger.
var log2 = Math.ceil(Math.log(approx) / Math.LN2);
var delta = (log2 <= 48) ? 1 : Math.pow(2, log2 - 48);
var delta = log2 <= 48 ? 1 : Math.pow(2, log2 - 48);
// Decrease the approximation until it is smaller than the remainder. Note
// that if it is too large, the product overflows and is negative.
@ -602,15 +609,13 @@ Long.prototype.xor = function(other) {
*/
Long.prototype.shiftLeft = function(numBits) {
numBits &= 63;
if (numBits == 0) {
if (numBits === 0) {
return this;
} else {
var low = this.low_;
if (numBits < 32) {
var high = this.high_;
return Long.fromBits(
low << numBits,
(high << numBits) | (low >>> (32 - numBits)));
return Long.fromBits(low << numBits, (high << numBits) | (low >>> (32 - numBits)));
} else {
return Long.fromBits(0, low << (numBits - 32));
}
@ -626,19 +631,15 @@ Long.prototype.shiftLeft = function(numBits) {
*/
Long.prototype.shiftRight = function(numBits) {
numBits &= 63;
if (numBits == 0) {
if (numBits === 0) {
return this;
} else {
var high = this.high_;
if (numBits < 32) {
var low = this.low_;
return Long.fromBits(
(low >>> numBits) | (high << (32 - numBits)),
high >> numBits);
return Long.fromBits((low >>> numBits) | (high << (32 - numBits)), high >> numBits);
} else {
return Long.fromBits(
high >> (numBits - 32),
high >= 0 ? 0 : -1);
return Long.fromBits(high >> (numBits - 32), high >= 0 ? 0 : -1);
}
}
};
@ -652,16 +653,14 @@ Long.prototype.shiftRight = function(numBits) {
*/
Long.prototype.shiftRightUnsigned = function(numBits) {
numBits &= 63;
if (numBits == 0) {
if (numBits === 0) {
return this;
} else {
var high = this.high_;
if (numBits < 32) {
var low = this.low_;
return Long.fromBits(
(low >>> numBits) | (high << (32 - numBits)),
high >>> numBits);
} else if (numBits == 32) {
return Long.fromBits((low >>> numBits) | (high << (32 - numBits)), high >>> numBits);
} else if (numBits === 32) {
return Long.fromBits(high, 0);
} else {
return Long.fromBits(high >>> (numBits - 32), 0);
@ -708,9 +707,7 @@ Long.fromNumber = function(value) {
} else if (value < 0) {
return Long.fromNumber(-value).negate();
} else {
return new Long(
(value % Long.TWO_PWR_32_DBL_) | 0,
(value / Long.TWO_PWR_32_DBL_) | 0);
return new Long((value % Long.TWO_PWR_32_DBL_) | 0, (value / Long.TWO_PWR_32_DBL_) | 0);
}
};
@ -735,7 +732,7 @@ Long.fromBits = function(lowBits, highBits) {
* @return {Long} the corresponding Long value.
*/
Long.fromString = function(str, opt_radix) {
if (str.length == 0) {
if (str.length === 0) {
throw Error('number format error: empty string');
}
@ -744,7 +741,7 @@ Long.fromString = function(str, opt_radix) {
throw Error('radix out of range: ' + radix);
}
if (str.charAt(0) == '-') {
if (str.charAt(0) === '-') {
return Long.fromString(str.substring(1), radix).negate();
} else if (str.indexOf('-') >= 0) {
throw Error('number format error: interior "-" character: ' + str);
@ -772,7 +769,6 @@ Long.fromString = function(str, opt_radix) {
// NOTE: Common constant values ZERO, ONE, NEG_ONE, etc. are defined below the
// from* methods on which they depend.
/**
* A cache of the Long representations of small integer values.
* @type {Object}
@ -837,8 +833,7 @@ Long.ONE = Long.fromInt(1);
Long.NEG_ONE = Long.fromInt(-1);
/** @type {Long} */
Long.MAX_VALUE =
Long.fromBits(0xFFFFFFFF | 0, 0x7FFFFFFF | 0);
Long.MAX_VALUE = Long.fromBits(0xffffffff | 0, 0x7fffffff | 0);
/** @type {Long} */
Long.MIN_VALUE = Long.fromBits(0, 0x80000000 | 0);
@ -853,4 +848,4 @@ Long.TWO_PWR_24_ = Long.fromInt(1 << 24);
* Expose.
*/
module.exports = Long;
module.exports.Long = Long;
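A sketch of `Long` for integers outside the safe double range (assuming the type is re-exported on the main constructor):

```
var Long = require('bson').Long; // assumed re-export of ./long

var a = Long.fromNumber(2147483648);         // beyond 32-bit int range
var b = Long.fromString('9007199254740993'); // beyond Number.MAX_SAFE_INTEGER
console.log(a.add(Long.ONE).toString());     // '2147483649'
console.log(b.toString());                   // exact digits, no floating-point rounding
```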
module.exports.Long = Long;

View file

@ -1,7 +1,7 @@
"use strict"
'use strict';
// We have an ES6 Map available, return the native instance
if(typeof global.Map !== 'undefined') {
if (typeof global.Map !== 'undefined') {
module.exports = global.Map;
module.exports.Map = global.Map;
} else {
@ -10,8 +10,8 @@ if(typeof global.Map !== 'undefined') {
this._keys = [];
this._values = {};
for(var i = 0; i < array.length; i++) {
if(array[i] == null) continue; // skip null and undefined
for (var i = 0; i < array.length; i++) {
if (array[i] == null) continue; // skip null and undefined
var entry = array[i];
var key = entry[0];
var value = entry[1];
@ -19,24 +19,24 @@ if(typeof global.Map !== 'undefined') {
this._keys.push(key);
// Add the key and value to the values dictionary with a pointer
// to the location in the ordered keys list
this._values[key] = {v: value, i: this._keys.length - 1};
this._values[key] = { v: value, i: this._keys.length - 1 };
}
}
};
Map.prototype.clear = function() {
this._keys = [];
this._values = {};
}
};
Map.prototype.delete = function(key) {
var value = this._values[key];
if(value == null) return false;
if (value == null) return false;
// Delete entry
delete this._values[key];
// Remove the key from the ordered keys list
this._keys.splice(value.i, 1);
return true;
}
};
Map.prototype.entries = function() {
var self = this;
@ -48,30 +48,30 @@ if(typeof global.Map !== 'undefined') {
return {
value: key !== undefined ? [key, self._values[key].v] : undefined,
done: key !== undefined ? false : true
}
};
}
};
}
};
Map.prototype.forEach = function(callback, self) {
self = self || this;
for(var i = 0; i < this._keys.length; i++) {
for (var i = 0; i < this._keys.length; i++) {
var key = this._keys[i];
// Call the forEach callback
callback.call(self, this._values[key].v, key, self);
}
}
};
Map.prototype.get = function(key) {
return this._values[key] ? this._values[key].v : undefined;
}
};
Map.prototype.has = function(key) {
return this._values[key] != null;
}
};
Map.prototype.keys = function(key) {
Map.prototype.keys = function() {
var self = this;
var index = 0;
@ -81,13 +81,13 @@ if(typeof global.Map !== 'undefined') {
return {
value: key !== undefined ? key : undefined,
done: key !== undefined ? false : true
}
};
}
};
}
};
Map.prototype.set = function(key, value) {
if(this._values[key]) {
if (this._values[key]) {
this._values[key].v = value;
return this;
}
@ -96,11 +96,11 @@ if(typeof global.Map !== 'undefined') {
this._keys.push(key);
// Add the key and value to the values dictionary with a pointer
// to the location in the ordered keys list
this._values[key] = {v: value, i: this._keys.length - 1};
this._values[key] = { v: value, i: this._keys.length - 1 };
return this;
}
};
Map.prototype.values = function(key, value) {
Map.prototype.values = function() {
var self = this;
var index = 0;
@ -110,17 +110,19 @@ if(typeof global.Map !== 'undefined') {
return {
value: key !== undefined ? self._values[key].v : undefined,
done: key !== undefined ? false : true
}
};
}
};
}
};
// Expose the number of entries via a 'size' property, mirroring the native Map
Object.defineProperty(Map.prototype, 'size', {
enumerable:true,
get: function() { return this._keys.length; }
enumerable: true,
get: function() {
return this._keys.length;
}
});
module.exports = Map;
module.exports.Map = Map;
}
}
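Whichever implementation is picked (the native ES6 Map or this polyfill), the insertion-ordered API is the same; a brief sketch, assuming `Map` is re-exported on the main constructor:

```
var Map = require('bson').Map; // native Map where available, otherwise the polyfill above

var m = new Map([['a', 1], ['b', 2]]);
m.set('c', 3);
console.log(m.get('b')); // 2
console.log(m.size);     // 3
m.forEach(function(value, key) {
  console.log(key, value); // iterates in insertion order
});
```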

View file

@ -5,10 +5,10 @@
* @return {MaxKey} A MaxKey instance
*/
function MaxKey() {
if(!(this instanceof MaxKey)) return new MaxKey();
this._bsontype = 'MaxKey';
if (!(this instanceof MaxKey)) return new MaxKey();
this._bsontype = 'MaxKey';
}
module.exports = MaxKey;
module.exports.MaxKey = MaxKey;
module.exports.MaxKey = MaxKey;

View file

@ -5,10 +5,10 @@
* @return {MinKey} A MinKey instance
*/
function MinKey() {
if(!(this instanceof MinKey)) return new MinKey();
if (!(this instanceof MinKey)) return new MinKey();
this._bsontype = 'MinKey';
}
module.exports = MinKey;
module.exports.MinKey = MinKey;
module.exports.MinKey = MinKey;

View file

@ -1,3 +1,6 @@
// Custom inspect property name / symbol.
var inspect = 'inspect';
/**
* Machine id.
*
@ -6,10 +9,20 @@
* that would mean an async call to gethostname, so we don't bother.
* @ignore
*/
var MACHINE_ID = parseInt(Math.random() * 0xFFFFFF, 10);
var MACHINE_ID = parseInt(Math.random() * 0xffffff, 10);
// Regular expression that checks for hex value
var checkForHexRegExp = new RegExp("^[0-9a-fA-F]{24}$");
var checkForHexRegExp = new RegExp('^[0-9a-fA-F]{24}$');
// Check if buffer exists
try {
if (Buffer && Buffer.from) {
var hasBufferType = true;
inspect = require('util').inspect.custom || 'inspect';
}
} catch (err) {
hasBufferType = false;
}
/**
* Create a new ObjectID instance
@ -21,37 +34,50 @@ var checkForHexRegExp = new RegExp("^[0-9a-fA-F]{24}$");
*/
var ObjectID = function ObjectID(id) {
// Duck-typing to support ObjectId from different npm packages
if(id instanceof ObjectID) return id;
if(!(this instanceof ObjectID)) return new ObjectID(id);
if (id instanceof ObjectID) return id;
if (!(this instanceof ObjectID)) return new ObjectID(id);
this._bsontype = 'ObjectID';
var __id = null;
// The most common use case (blank id, new ObjectId instance)
if (id == null || typeof id === 'number') {
// Generate a new id
this.id = this.generate(id);
// If we are caching the hex string
if (ObjectID.cacheHexString) this.__id = this.toString('hex');
// Return the object
return;
}
// Check if the passed in id is valid
var valid = ObjectID.isValid(id);
// Throw an error if it's not a valid setup
if(!valid && id != null){
throw new Error("Argument passed in must be a single String of 12 bytes or a string of 24 hex characters");
} else if(valid && typeof id == 'string' && id.length == 24) {
if (!valid && id != null) {
throw new Error(
'Argument passed in must be a single String of 12 bytes or a string of 24 hex characters'
);
} else if (valid && typeof id === 'string' && id.length === 24 && hasBufferType) {
return new ObjectID(new Buffer(id, 'hex'));
} else if (valid && typeof id === 'string' && id.length === 24) {
return ObjectID.createFromHexString(id);
} else if(id == null || typeof id == 'number') {
// convert to 12 byte binary string
this.id = this.generate(id);
} else if(id != null && id.length === 12) {
} else if (id != null && id.length === 12) {
// assume 12 byte string
this.id = id;
} else if(id != null && id.toHexString) {
} else if (id != null && id.toHexString) {
// Duck-typing to support ObjectId from different npm packages
return id;
} else {
throw new Error("Argument passed in must be a single String of 12 bytes or a string of 24 hex characters");
throw new Error(
'Argument passed in must be a single String of 12 bytes or a string of 24 hex characters'
);
}
if(ObjectID.cacheHexString) this.__id = this.toHexString();
if (ObjectID.cacheHexString) this.__id = this.toString('hex');
};
// Allow usage of ObjectId as well as ObjectID
var ObjectId = ObjectID;
// var ObjectId = ObjectID;
// Precomputed hex table enables speedy hex string conversion
var hexTable = [];
@ -66,16 +92,20 @@ for (var i = 0; i < 256; i++) {
* @return {string} return the 24 byte hex string representation.
*/
ObjectID.prototype.toHexString = function() {
if(ObjectID.cacheHexString && this.__id) return this.__id;
if (ObjectID.cacheHexString && this.__id) return this.__id;
var hexString = '';
if(!this.id || !this.id.length) {
throw new Error('invalid ObjectId, ObjectId.id must be either a string or a Buffer, but is [' + JSON.stringify(this.id) + ']');
if (!this.id || !this.id.length) {
throw new Error(
'invalid ObjectId, ObjectId.id must be either a string or a Buffer, but is [' +
JSON.stringify(this.id) +
']'
);
}
if(this.id instanceof _Buffer) {
if (this.id instanceof _Buffer) {
hexString = convertToHex(this.id);
if(ObjectID.cacheHexString) this.__id = hexString;
if (ObjectID.cacheHexString) this.__id = hexString;
return hexString;
}
@ -83,7 +113,7 @@ ObjectID.prototype.toHexString = function() {
hexString += hexTable[this.id.charCodeAt(i)];
}
if(ObjectID.cacheHexString) this.__id = hexString;
if (ObjectID.cacheHexString) this.__id = hexString;
return hexString;
};
@ -95,7 +125,7 @@ ObjectID.prototype.toHexString = function() {
* @ignore
*/
ObjectID.prototype.get_inc = function() {
return ObjectID.index = (ObjectID.index + 1) % 0xFFFFFF;
return (ObjectID.index = (ObjectID.index + 1) % 0xffffff);
};
/**
@ -117,12 +147,15 @@ ObjectID.prototype.getInc = function() {
* @return {Buffer} return the 12 byte id buffer string.
*/
ObjectID.prototype.generate = function(time) {
if ('number' != typeof time) {
time = ~~(Date.now()/1000);
if ('number' !== typeof time) {
time = ~~(Date.now() / 1000);
}
// Use pid
var pid = (typeof process === 'undefined' ? Math.floor(Math.random() * 100000) : process.pid) % 0xFFFF;
var pid =
(typeof process === 'undefined' || process.pid === 1
? Math.floor(Math.random() * 100000)
: process.pid) % 0xffff;
var inc = this.get_inc();
// Buffer used
var buffer = new Buffer(12);
@ -149,10 +182,17 @@ ObjectID.prototype.generate = function(time) {
/**
* Converts the id into a 24 byte hex string for printing
*
* @param {String} format The Buffer toString format parameter.
* @return {String} return the 24 byte hex string representation.
* @ignore
*/
ObjectID.prototype.toString = function() {
ObjectID.prototype.toString = function(format) {
// If the id is a buffer, use the buffer toString method to return the requested format
if (this.id && this.id.copy) {
return this.id.toString(typeof format === 'string' ? format : 'hex');
}
// if(this.buffer )
return this.toHexString();
};
@ -162,7 +202,7 @@ ObjectID.prototype.toString = function() {
* @return {String} return the 24 byte hex string representation.
* @ignore
*/
ObjectID.prototype.inspect = ObjectID.prototype.toString;
ObjectID.prototype[inspect] = ObjectID.prototype.toString;
/**
* Converts to its JSON representation.
@ -181,23 +221,28 @@ ObjectID.prototype.toJSON = function() {
* @param {object} otherID ObjectID instance to compare against.
* @return {boolean} the result of comparing two ObjectID's
*/
ObjectID.prototype.equals = function equals (otherId) {
var id;
ObjectID.prototype.equals = function equals(otherId) {
// var id;
if(otherId instanceof ObjectID) {
return this.toString() == otherId.toString();
} else if(typeof otherId == 'string' && ObjectID.isValid(otherId) && otherId.length == 12 && this.id instanceof _Buffer) {
if (otherId instanceof ObjectID) {
return this.toString() === otherId.toString();
} else if (
typeof otherId === 'string' &&
ObjectID.isValid(otherId) &&
otherId.length === 12 &&
this.id instanceof _Buffer
) {
return otherId === this.id.toString('binary');
} else if(typeof otherId == 'string' && ObjectID.isValid(otherId) && otherId.length == 24) {
return otherId === this.toHexString();
} else if(typeof otherId == 'string' && ObjectID.isValid(otherId) && otherId.length == 12) {
} else if (typeof otherId === 'string' && ObjectID.isValid(otherId) && otherId.length === 24) {
return otherId.toLowerCase() === this.toHexString();
} else if (typeof otherId === 'string' && ObjectID.isValid(otherId) && otherId.length === 12) {
return otherId === this.id;
} else if(otherId != null && (otherId instanceof ObjectID || otherId.toHexString)) {
} else if (otherId != null && (otherId instanceof ObjectID || otherId.toHexString)) {
return otherId.toHexString() === this.toHexString();
} else {
return false;
}
}
};
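One behavioural detail in the rewritten `equals` above: a 24-character hex string argument is now lower-cased before being compared against `toHexString()`, so mixed-case hex input compares equal. A quick sketch with hypothetical values, assuming `ObjectID` is exported from the package root:

```
var ObjectID = require('bson').ObjectID;

var id = new ObjectID();
var upper = id.toHexString().toUpperCase();

// 24-character hex comparison is case-insensitive after this change
console.log(id.equals(upper)); // true

// Comparing two ObjectID instances still goes through their string forms
console.log(id.equals(new ObjectID(id.toHexString()))); // true
```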
/**
* Returns the generation date (accurate up to the second) that this ID was generated.
@ -207,20 +252,20 @@ ObjectID.prototype.equals = function equals (otherId) {
*/
ObjectID.prototype.getTimestamp = function() {
var timestamp = new Date();
var time = this.id[3] | this.id[2] << 8 | this.id[1] << 16 | this.id[0] << 24;
var time = this.id[3] | (this.id[2] << 8) | (this.id[1] << 16) | (this.id[0] << 24);
timestamp.setTime(Math.floor(time) * 1000);
return timestamp;
}
};
/**
* @ignore
*/
ObjectID.index = ~~(Math.random() * 0xFFFFFF);
ObjectID.index = ~~(Math.random() * 0xffffff);
/**
* @ignore
*/
ObjectID.createPk = function createPk () {
ObjectID.createPk = function createPk() {
return new ObjectID();
};
@ -231,7 +276,7 @@ ObjectID.createPk = function createPk () {
* @param {number} time an integer number representing a number of seconds.
* @return {ObjectID} return the created ObjectID
*/
ObjectID.createFromTime = function createFromTime (time) {
ObjectID.createFromTime = function createFromTime(time) {
var buffer = new Buffer([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]);
// Encode time into first 4 bytes
buffer[3] = time & 0xff;
@ -243,16 +288,16 @@ ObjectID.createFromTime = function createFromTime (time) {
};
// Lookup tables
var encodeLookup = '0123456789abcdef'.split('')
var decodeLookup = []
var i = 0
while (i < 10) decodeLookup[0x30 + i] = i++
while (i < 16) decodeLookup[0x41 - 10 + i] = decodeLookup[0x61 - 10 + i] = i++
//var encodeLookup = '0123456789abcdef'.split('');
var decodeLookup = [];
i = 0;
while (i < 10) decodeLookup[0x30 + i] = i++;
while (i < 16) decodeLookup[0x41 - 10 + i] = decodeLookup[0x61 - 10 + i] = i++;
var _Buffer = Buffer;
var convertToHex = function(bytes) {
return bytes.toString('hex');
}
};
/**
* Creates an ObjectID from a hex string representation of an ObjectID.
@ -261,25 +306,24 @@ var convertToHex = function(bytes) {
* @param {string} hexString create a ObjectID from a passed in 24 byte hexstring.
* @return {ObjectID} return the created ObjectID
*/
ObjectID.createFromHexString = function createFromHexString (string) {
ObjectID.createFromHexString = function createFromHexString(string) {
// Throw an error if it's not a valid setup
if(typeof string === 'undefined' || string != null && string.length != 24)
throw new Error("Argument passed in must be a single String of 12 bytes or a string of 24 hex characters");
var length = string.length;
if(length > 12*2) {
throw new Error('Id cannot be longer than 12 bytes');
if (typeof string === 'undefined' || (string != null && string.length !== 24)) {
throw new Error(
'Argument passed in must be a single String of 12 bytes or a string of 24 hex characters'
);
}
// Use Buffer.from method if available
if (hasBufferType) return new ObjectID(new Buffer(string, 'hex'));
// Calculate lengths
var sizeof = length >> 1;
var array = new _Buffer(sizeof);
var array = new _Buffer(12);
var n = 0;
var i = 0;
while (i < length) {
array[n++] = decodeLookup[string.charCodeAt(i++)] << 4 | decodeLookup[string.charCodeAt(i++)]
while (i < 24) {
array[n++] = (decodeLookup[string.charCodeAt(i++)] << 4) | decodeLookup[string.charCodeAt(i++)];
}
return new ObjectID(array);
@ -292,27 +336,27 @@ ObjectID.createFromHexString = function createFromHexString (string) {
* @return {boolean} return true if the value is a valid bson ObjectId, return false otherwise.
*/
ObjectID.isValid = function isValid(id) {
if(id == null) return false;
if (id == null) return false;
if(typeof id == 'number') {
if (typeof id === 'number') {
return true;
}
if(typeof id == 'string') {
return id.length == 12 || (id.length == 24 && checkForHexRegExp.test(id));
if (typeof id === 'string') {
return id.length === 12 || (id.length === 24 && checkForHexRegExp.test(id));
}
if(id instanceof ObjectID) {
if (id instanceof ObjectID) {
return true;
}
if(id instanceof _Buffer) {
if (id instanceof _Buffer) {
return true;
}
// Duck-Typing detection of ObjectId like objects
if(id.toHexString) {
return id.id.length == 12 || (id.id.length == 24 && checkForHexRegExp.test(id.id));
if (id.toHexString) {
return id.id.length === 12 || (id.id.length === 24 && checkForHexRegExp.test(id.id));
}
return false;
@ -321,18 +365,18 @@ ObjectID.isValid = function isValid(id) {
/**
* @ignore
*/
Object.defineProperty(ObjectID.prototype, "generationTime", {
enumerable: true
, get: function () {
return this.id[3] | this.id[2] << 8 | this.id[1] << 16 | this.id[0] << 24;
}
, set: function (value) {
// Encode time into first 4 bytes
this.id[3] = value & 0xff;
this.id[2] = (value >> 8) & 0xff;
this.id[1] = (value >> 16) & 0xff;
this.id[0] = (value >> 24) & 0xff;
}
Object.defineProperty(ObjectID.prototype, 'generationTime', {
enumerable: true,
get: function() {
return this.id[3] | (this.id[2] << 8) | (this.id[1] << 16) | (this.id[0] << 24);
},
set: function(value) {
// Encode time into first 4 bytes
this.id[3] = value & 0xff;
this.id[2] = (value >> 8) & 0xff;
this.id[1] = (value >> 16) & 0xff;
this.id[0] = (value >> 24) & 0xff;
}
});
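Taken together, the ObjectID changes above detect a usable `Buffer` implementation (`hasBufferType`), route Node's custom `util.inspect` name to `toString`, and fall back to a random value when `process.pid` is 1 (the usual situation inside containers, per the 1.0.5 changelog entry). A short usage sketch, assuming the `bson@1.0.x` export shape:

```
var ObjectID = require('bson').ObjectID;

// Fresh id: 4-byte timestamp, 3-byte machine id, 2-byte pid, 3-byte counter
var id = new ObjectID();
console.log(id.toHexString()); // 24 lowercase hex characters
console.log(id.getTimestamp()); // Date, accurate to the second

// Rebuild from a 24-character hex string
var same = ObjectID.createFromHexString(id.toHexString());
console.log(id.equals(same)); // true

// Validation helpers
console.log(ObjectID.isValid(id.toHexString())); // true
console.log(ObjectID.isValid('not-an-id')); // false
```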
/**

View file

@ -1,137 +1,240 @@
"use strict"
'use strict';
var writeIEEE754 = require('../float_parser').writeIEEE754
, readIEEE754 = require('../float_parser').readIEEE754
, Long = require('../long').Long
, Double = require('../double').Double
, Timestamp = require('../timestamp').Timestamp
, ObjectID = require('../objectid').ObjectID
, Symbol = require('../symbol').Symbol
, BSONRegExp = require('../regexp').BSONRegExp
, Code = require('../code').Code
, Decimal128 = require('../decimal128')
, MinKey = require('../min_key').MinKey
, MaxKey = require('../max_key').MaxKey
, DBRef = require('../db_ref').DBRef
, Binary = require('../binary').Binary;
var Long = require('../long').Long,
Double = require('../double').Double,
Timestamp = require('../timestamp').Timestamp,
ObjectID = require('../objectid').ObjectID,
Symbol = require('../symbol').Symbol,
BSONRegExp = require('../regexp').BSONRegExp,
Code = require('../code').Code,
Decimal128 = require('../decimal128'),
MinKey = require('../min_key').MinKey,
MaxKey = require('../max_key').MaxKey,
DBRef = require('../db_ref').DBRef,
Binary = require('../binary').Binary;
var normalizedFunctionString = require('./utils').normalizedFunctionString;
// To ensure that 0.4 of node works correctly
var isDate = function isDate(d) {
return typeof d === 'object' && Object.prototype.toString.call(d) === '[object Date]';
}
};
var calculateObjectSize = function calculateObjectSize(object, serializeFunctions, ignoreUndefined) {
var totalLength = (4 + 1);
var calculateObjectSize = function calculateObjectSize(
object,
serializeFunctions,
ignoreUndefined
) {
var totalLength = 4 + 1;
if(Array.isArray(object)) {
for(var i = 0; i < object.length; i++) {
totalLength += calculateElement(i.toString(), object[i], serializeFunctions, true, ignoreUndefined)
if (Array.isArray(object)) {
for (var i = 0; i < object.length; i++) {
totalLength += calculateElement(
i.toString(),
object[i],
serializeFunctions,
true,
ignoreUndefined
);
}
} else {
// If we have toBSON defined, override the current object
if(object.toBSON) {
object = object.toBSON();
}
// If we have toBSON defined, override the current object
if (object.toBSON) {
object = object.toBSON();
}
// Calculate size
for(var key in object) {
totalLength += calculateElement(key, object[key], serializeFunctions, false, ignoreUndefined)
// Calculate size
for (var key in object) {
totalLength += calculateElement(key, object[key], serializeFunctions, false, ignoreUndefined);
}
}
return totalLength;
}
};
/**
* @ignore
* @api private
*/
function calculateElement(name, value, serializeFunctions, isArray, ignoreUndefined) {
// If we have toBSON defined, override the current object
if(value && value.toBSON){
// If we have toBSON defined, override the current object
if (value && value.toBSON) {
value = value.toBSON();
}
switch(typeof value) {
switch (typeof value) {
case 'string':
return 1 + Buffer.byteLength(name, 'utf8') + 1 + 4 + Buffer.byteLength(value, 'utf8') + 1;
case 'number':
if(Math.floor(value) === value && value >= BSON.JS_INT_MIN && value <= BSON.JS_INT_MAX) {
if(value >= BSON.BSON_INT32_MIN && value <= BSON.BSON_INT32_MAX) { // 32 bit
return (name != null ? (Buffer.byteLength(name, 'utf8') + 1) : 0) + (4 + 1);
if (Math.floor(value) === value && value >= BSON.JS_INT_MIN && value <= BSON.JS_INT_MAX) {
if (value >= BSON.BSON_INT32_MIN && value <= BSON.BSON_INT32_MAX) {
// 32 bit
return (name != null ? Buffer.byteLength(name, 'utf8') + 1 : 0) + (4 + 1);
} else {
return (name != null ? (Buffer.byteLength(name, 'utf8') + 1) : 0) + (8 + 1);
return (name != null ? Buffer.byteLength(name, 'utf8') + 1 : 0) + (8 + 1);
}
} else { // 64 bit
return (name != null ? (Buffer.byteLength(name, 'utf8') + 1) : 0) + (8 + 1);
} else {
// 64 bit
return (name != null ? Buffer.byteLength(name, 'utf8') + 1 : 0) + (8 + 1);
}
case 'undefined':
if(isArray || !ignoreUndefined) return (name != null ? (Buffer.byteLength(name, 'utf8') + 1) : 0) + (1);
if (isArray || !ignoreUndefined)
return (name != null ? Buffer.byteLength(name, 'utf8') + 1 : 0) + 1;
return 0;
case 'boolean':
return (name != null ? (Buffer.byteLength(name, 'utf8') + 1) : 0) + (1 + 1);
return (name != null ? Buffer.byteLength(name, 'utf8') + 1 : 0) + (1 + 1);
case 'object':
if(value == null || value instanceof MinKey || value instanceof MaxKey || value['_bsontype'] == 'MinKey' || value['_bsontype'] == 'MaxKey') {
return (name != null ? (Buffer.byteLength(name, 'utf8') + 1) : 0) + (1);
} else if(value instanceof ObjectID || value['_bsontype'] == 'ObjectID') {
return (name != null ? (Buffer.byteLength(name, 'utf8') + 1) : 0) + (12 + 1);
} else if(value instanceof Date || isDate(value)) {
return (name != null ? (Buffer.byteLength(name, 'utf8') + 1) : 0) + (8 + 1);
} else if(typeof Buffer !== 'undefined' && Buffer.isBuffer(value)) {
return (name != null ? (Buffer.byteLength(name, 'utf8') + 1) : 0) + (1 + 4 + 1) + value.length;
} else if(value instanceof Long || value instanceof Double || value instanceof Timestamp
|| value['_bsontype'] == 'Long' || value['_bsontype'] == 'Double' || value['_bsontype'] == 'Timestamp') {
return (name != null ? (Buffer.byteLength(name, 'utf8') + 1) : 0) + (8 + 1);
} else if(value instanceof Decimal128 || value['_bsontype'] == 'Decimal128') {
return (name != null ? (Buffer.byteLength(name, 'utf8') + 1) : 0) + (16 + 1);
} else if(value instanceof Code || value['_bsontype'] == 'Code') {
if (
value == null ||
value instanceof MinKey ||
value instanceof MaxKey ||
value['_bsontype'] === 'MinKey' ||
value['_bsontype'] === 'MaxKey'
) {
return (name != null ? Buffer.byteLength(name, 'utf8') + 1 : 0) + 1;
} else if (value instanceof ObjectID || value['_bsontype'] === 'ObjectID') {
return (name != null ? Buffer.byteLength(name, 'utf8') + 1 : 0) + (12 + 1);
} else if (value instanceof Date || isDate(value)) {
return (name != null ? Buffer.byteLength(name, 'utf8') + 1 : 0) + (8 + 1);
} else if (typeof Buffer !== 'undefined' && Buffer.isBuffer(value)) {
return (
(name != null ? Buffer.byteLength(name, 'utf8') + 1 : 0) + (1 + 4 + 1) + value.length
);
} else if (
value instanceof Long ||
value instanceof Double ||
value instanceof Timestamp ||
value['_bsontype'] === 'Long' ||
value['_bsontype'] === 'Double' ||
value['_bsontype'] === 'Timestamp'
) {
return (name != null ? Buffer.byteLength(name, 'utf8') + 1 : 0) + (8 + 1);
} else if (value instanceof Decimal128 || value['_bsontype'] === 'Decimal128') {
return (name != null ? Buffer.byteLength(name, 'utf8') + 1 : 0) + (16 + 1);
} else if (value instanceof Code || value['_bsontype'] === 'Code') {
// Calculate size depending on the availability of a scope
if(value.scope != null && Object.keys(value.scope).length > 0) {
return (name != null ? (Buffer.byteLength(name, 'utf8') + 1) : 0) + 1 + 4 + 4 + Buffer.byteLength(value.code.toString(), 'utf8') + 1 + calculateObjectSize(value.scope, serializeFunctions, ignoreUndefined);
if (value.scope != null && Object.keys(value.scope).length > 0) {
return (
(name != null ? Buffer.byteLength(name, 'utf8') + 1 : 0) +
1 +
4 +
4 +
Buffer.byteLength(value.code.toString(), 'utf8') +
1 +
calculateObjectSize(value.scope, serializeFunctions, ignoreUndefined)
);
} else {
return (name != null ? (Buffer.byteLength(name, 'utf8') + 1) : 0) + 1 + 4 + Buffer.byteLength(value.code.toString(), 'utf8') + 1;
return (
(name != null ? Buffer.byteLength(name, 'utf8') + 1 : 0) +
1 +
4 +
Buffer.byteLength(value.code.toString(), 'utf8') +
1
);
}
} else if(value instanceof Binary || value['_bsontype'] == 'Binary') {
} else if (value instanceof Binary || value['_bsontype'] === 'Binary') {
// Check what kind of subtype we have
if(value.sub_type == Binary.SUBTYPE_BYTE_ARRAY) {
return (name != null ? (Buffer.byteLength(name, 'utf8') + 1) : 0) + (value.position + 1 + 4 + 1 + 4);
if (value.sub_type === Binary.SUBTYPE_BYTE_ARRAY) {
return (
(name != null ? Buffer.byteLength(name, 'utf8') + 1 : 0) +
(value.position + 1 + 4 + 1 + 4)
);
} else {
return (name != null ? (Buffer.byteLength(name, 'utf8') + 1) : 0) + (value.position + 1 + 4 + 1);
return (
(name != null ? Buffer.byteLength(name, 'utf8') + 1 : 0) + (value.position + 1 + 4 + 1)
);
}
} else if(value instanceof Symbol || value['_bsontype'] == 'Symbol') {
return (name != null ? (Buffer.byteLength(name, 'utf8') + 1) : 0) + Buffer.byteLength(value.value, 'utf8') + 4 + 1 + 1;
} else if(value instanceof DBRef || value['_bsontype'] == 'DBRef') {
} else if (value instanceof Symbol || value['_bsontype'] === 'Symbol') {
return (
(name != null ? Buffer.byteLength(name, 'utf8') + 1 : 0) +
Buffer.byteLength(value.value, 'utf8') +
4 +
1 +
1
);
} else if (value instanceof DBRef || value['_bsontype'] === 'DBRef') {
// Set up correct object for serialization
var ordered_values = {
'$ref': value.namespace
, '$id' : value.oid
$ref: value.namespace,
$id: value.oid
};
// Add db reference if it exists
if(null != value.db) {
if (null != value.db) {
ordered_values['$db'] = value.db;
}
return (name != null ? (Buffer.byteLength(name, 'utf8') + 1) : 0) + 1 + calculateObjectSize(ordered_values, serializeFunctions, ignoreUndefined);
} else if(value instanceof RegExp || Object.prototype.toString.call(value) === '[object RegExp]') {
return (name != null ? (Buffer.byteLength(name, 'utf8') + 1) : 0) + 1 + Buffer.byteLength(value.source, 'utf8') + 1
+ (value.global ? 1 : 0) + (value.ignoreCase ? 1 : 0) + (value.multiline ? 1 : 0) + 1
} else if(value instanceof BSONRegExp || value['_bsontype'] == 'BSONRegExp') {
return (name != null ? (Buffer.byteLength(name, 'utf8') + 1) : 0) + 1 + Buffer.byteLength(value.pattern, 'utf8') + 1
+ Buffer.byteLength(value.options, 'utf8') + 1
return (
(name != null ? Buffer.byteLength(name, 'utf8') + 1 : 0) +
1 +
calculateObjectSize(ordered_values, serializeFunctions, ignoreUndefined)
);
} else if (
value instanceof RegExp ||
Object.prototype.toString.call(value) === '[object RegExp]'
) {
return (
(name != null ? Buffer.byteLength(name, 'utf8') + 1 : 0) +
1 +
Buffer.byteLength(value.source, 'utf8') +
1 +
(value.global ? 1 : 0) +
(value.ignoreCase ? 1 : 0) +
(value.multiline ? 1 : 0) +
1
);
} else if (value instanceof BSONRegExp || value['_bsontype'] === 'BSONRegExp') {
return (
(name != null ? Buffer.byteLength(name, 'utf8') + 1 : 0) +
1 +
Buffer.byteLength(value.pattern, 'utf8') +
1 +
Buffer.byteLength(value.options, 'utf8') +
1
);
} else {
return (name != null ? (Buffer.byteLength(name, 'utf8') + 1) : 0) + calculateObjectSize(value, serializeFunctions, ignoreUndefined) + 1;
return (
(name != null ? Buffer.byteLength(name, 'utf8') + 1 : 0) +
calculateObjectSize(value, serializeFunctions, ignoreUndefined) +
1
);
}
case 'function':
// WTF for 0.4.X where typeof /someregexp/ === 'function'
if(value instanceof RegExp || Object.prototype.toString.call(value) === '[object RegExp]' || String.call(value) == '[object RegExp]') {
return (name != null ? (Buffer.byteLength(name, 'utf8') + 1) : 0) + 1 + Buffer.byteLength(value.source, 'utf8') + 1
+ (value.global ? 1 : 0) + (value.ignoreCase ? 1 : 0) + (value.multiline ? 1 : 0) + 1
if (
value instanceof RegExp ||
Object.prototype.toString.call(value) === '[object RegExp]' ||
String.call(value) === '[object RegExp]'
) {
return (
(name != null ? Buffer.byteLength(name, 'utf8') + 1 : 0) +
1 +
Buffer.byteLength(value.source, 'utf8') +
1 +
(value.global ? 1 : 0) +
(value.ignoreCase ? 1 : 0) +
(value.multiline ? 1 : 0) +
1
);
} else {
if(serializeFunctions && value.scope != null && Object.keys(value.scope).length > 0) {
return (name != null ? (Buffer.byteLength(name, 'utf8') + 1) : 0) + 1 + 4 + 4 + Buffer.byteLength(value.toString(), 'utf8') + 1 + calculateObjectSize(value.scope, serializeFunctions, ignoreUndefined);
} else if(serializeFunctions) {
return (name != null ? (Buffer.byteLength(name, 'utf8') + 1) : 0) + 1 + 4 + Buffer.byteLength(value.toString(), 'utf8') + 1;
if (serializeFunctions && value.scope != null && Object.keys(value.scope).length > 0) {
return (
(name != null ? Buffer.byteLength(name, 'utf8') + 1 : 0) +
1 +
4 +
4 +
Buffer.byteLength(normalizedFunctionString(value), 'utf8') +
1 +
calculateObjectSize(value.scope, serializeFunctions, ignoreUndefined)
);
} else if (serializeFunctions) {
return (
(name != null ? Buffer.byteLength(name, 'utf8') + 1 : 0) +
1 +
4 +
Buffer.byteLength(normalizedFunctionString(value), 'utf8') +
1
);
}
}
}
@ -142,11 +245,11 @@ function calculateElement(name, value, serializeFunctions, isArray, ignoreUndefi
var BSON = {};
// BSON MAX VALUES
BSON.BSON_INT32_MAX = 0x7FFFFFFF;
BSON.BSON_INT32_MAX = 0x7fffffff;
BSON.BSON_INT32_MIN = -0x80000000;
// JS MAX PRECISE VALUES
BSON.JS_INT_MAX = 0x20000000000000; // Any integer up to 2^53 can be precisely represented by a double.
BSON.JS_INT_MIN = -0x20000000000000; // Any integer down to -2^53 can be precisely represented by a double.
BSON.JS_INT_MAX = 0x20000000000000; // Any integer up to 2^53 can be precisely represented by a double.
BSON.JS_INT_MIN = -0x20000000000000; // Any integer down to -2^53 can be precisely represented by a double.
module.exports = calculateObjectSize;
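`calculateObjectSize` above walks the document recursively and totals the exact number of bytes the serializer will need, now measuring function bodies through `normalizedFunctionString` so the size matches what later gets written. A sketch of the public entry point, assuming the instance method exposed by `bson@1.0.x`:

```
var BSON = require('bson');
var bson = new BSON();

var doc = { _id: new BSON.ObjectID(), name: 'size check', tags: ['a', 'b'] };

// The options mirror the serializer's: include function bodies, skip undefined values
var options = { serializeFunctions: false, ignoreUndefined: true };
var bytes = bson.calculateObjectSize(doc, options);

console.log(bytes); // predicted size in bytes
console.log(bson.serialize(doc, options).length === bytes); // expected to be true
```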

View file

@ -1,8 +1,6 @@
"use strict"
'use strict';
var readIEEE754 = require('../float_parser').readIEEE754,
f = require('util').format,
Long = require('../long').Long,
var Long = require('../long').Long,
Double = require('../double').Double,
Timestamp = require('../timestamp').Timestamp,
ObjectID = require('../objectid').ObjectID,
@ -10,216 +8,290 @@ var readIEEE754 = require('../float_parser').readIEEE754,
Code = require('../code').Code,
MinKey = require('../min_key').MinKey,
MaxKey = require('../max_key').MaxKey,
Decimal128 = require('../decimal128'),
Int32 = require('../int_32'),
Decimal128 = require('../decimal128'),
Int32 = require('../int_32'),
DBRef = require('../db_ref').DBRef,
BSONRegExp = require('../regexp').BSONRegExp,
Binary = require('../binary').Binary;
var deserialize = function(buffer, options, isArray) {
options = options == null ? {} : options;
var index = options && options.index ? options.index : 0;
// Read the document size
var size = buffer[index] | buffer[index+1] << 8 | buffer[index+2] << 16 | buffer[index+3] << 24;
options = options == null ? {} : options;
var index = options && options.index ? options.index : 0;
// Read the document size
var size =
buffer[index] |
(buffer[index + 1] << 8) |
(buffer[index + 2] << 16) |
(buffer[index + 3] << 24);
// Ensure buffer is valid size
if(size < 5 || buffer.length < size || (size + index) > buffer.length) {
throw new Error("corrupt bson message");
}
// Ensure buffer is valid size
if (size < 5 || buffer.length < size || size + index > buffer.length) {
throw new Error('corrupt bson message');
}
// Illegal end value
if(buffer[index + size - 1] != 0) {
throw new Error("One object, sized correctly, with a spot for an EOO, but the EOO isn't 0x00");
}
// Illegal end value
if (buffer[index + size - 1] !== 0) {
throw new Error("One object, sized correctly, with a spot for an EOO, but the EOO isn't 0x00");
}
// Start deserialization
return deserializeObject(buffer, index, options, isArray);
}
// Start deserialization
return deserializeObject(buffer, index, options, isArray);
};
var deserializeObject = function(buffer, index, options, isArray) {
var evalFunctions = options['evalFunctions'] == null ? false : options['evalFunctions'];
var evalFunctions = options['evalFunctions'] == null ? false : options['evalFunctions'];
var cacheFunctions = options['cacheFunctions'] == null ? false : options['cacheFunctions'];
var cacheFunctionsCrc32 = options['cacheFunctionsCrc32'] == null ? false : options['cacheFunctionsCrc32'];
var fieldsAsRaw = options['fieldsAsRaw'] == null ? null : options['fieldsAsRaw'];
var cacheFunctionsCrc32 =
options['cacheFunctionsCrc32'] == null ? false : options['cacheFunctionsCrc32'];
// Return raw bson buffer instead of parsing it
var raw = options['raw'] == null ? false : options['raw'];
if (!cacheFunctionsCrc32) var crc32 = null;
// Return BSONRegExp objects instead of native regular expressions
var bsonRegExp = typeof options['bsonRegExp'] == 'boolean' ? options['bsonRegExp'] : false;
var fieldsAsRaw = options['fieldsAsRaw'] == null ? null : options['fieldsAsRaw'];
// Controls the promotion of values vs wrapper classes
var promoteBuffers = options['promoteBuffers'] == null ? false : options['promoteBuffers'];
var promoteLongs = options['promoteLongs'] == null ? true : options['promoteLongs'];
var promoteValues = options['promoteValues'] == null ? true : options['promoteValues'];
// Return raw bson buffer instead of parsing it
var raw = options['raw'] == null ? false : options['raw'];
// Set the start index
var startIndex = index;
// Return BSONRegExp objects instead of native regular expressions
var bsonRegExp = typeof options['bsonRegExp'] === 'boolean' ? options['bsonRegExp'] : false;
// Controls the promotion of values vs wrapper classes
var promoteBuffers = options['promoteBuffers'] == null ? false : options['promoteBuffers'];
var promoteLongs = options['promoteLongs'] == null ? true : options['promoteLongs'];
var promoteValues = options['promoteValues'] == null ? true : options['promoteValues'];
// Set the start index
var startIndex = index;
// Validate that we have at least 4 bytes of buffer
if(buffer.length < 5) throw new Error("corrupt bson message < 5 bytes long");
if (buffer.length < 5) throw new Error('corrupt bson message < 5 bytes long');
// Read the document size
var size = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
// Read the document size
var size =
buffer[index++] | (buffer[index++] << 8) | (buffer[index++] << 16) | (buffer[index++] << 24);
// Ensure buffer is valid size
if(size < 5 || size > buffer.length) throw new Error("corrupt bson message");
// Ensure buffer is valid size
if (size < 5 || size > buffer.length) throw new Error('corrupt bson message');
// Create holding object
var object = isArray ? [] : {};
// Used for arrays to skip having to perform utf8 decoding
var arrayIndex = 0;
// Used for arrays to skip having to perform utf8 decoding
var arrayIndex = 0;
var done = false;
// While we have more left data left keep parsing
while(true) {
// while (buffer[index + 1] !== 0) {
while (!done) {
// Read the type
var elementType = buffer[index++];
// If we get a zero it's the last byte, exit
if(elementType == 0) {
break;
}
if (elementType === 0) break;
// Get the start search index
var i = index;
// Locate the end of the c string
while(buffer[i] !== 0x00 && i < buffer.length) {
i++
}
// Get the start search index
var i = index;
// Locate the end of the c string
while (buffer[i] !== 0x00 && i < buffer.length) {
i++;
}
// If are at the end of the buffer there is a problem with the document
if(i >= buffer.length) throw new Error("Bad BSON Document: illegal CString")
var name = isArray ? arrayIndex++ : buffer.toString('utf8', index, i);
// If we are at the end of the buffer there is a problem with the document
if (i >= buffer.length) throw new Error('Bad BSON Document: illegal CString');
var name = isArray ? arrayIndex++ : buffer.toString('utf8', index, i);
index = i + 1;
index = i + 1;
if(elementType == BSON.BSON_DATA_STRING) {
var stringSize = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
if(stringSize <= 0 || stringSize > (buffer.length - index) || buffer[index + stringSize - 1] != 0) throw new Error("bad string length in bson");
if (elementType === BSON.BSON_DATA_STRING) {
var stringSize =
buffer[index++] |
(buffer[index++] << 8) |
(buffer[index++] << 16) |
(buffer[index++] << 24);
if (
stringSize <= 0 ||
stringSize > buffer.length - index ||
buffer[index + stringSize - 1] !== 0
)
throw new Error('bad string length in bson');
object[name] = buffer.toString('utf8', index, index + stringSize - 1);
index = index + stringSize;
} else if(elementType == BSON.BSON_DATA_OID) {
var oid = new Buffer(12);
buffer.copy(oid, 0, index, index + 12);
} else if (elementType === BSON.BSON_DATA_OID) {
var oid = new Buffer(12);
buffer.copy(oid, 0, index, index + 12);
object[name] = new ObjectID(oid);
index = index + 12;
} else if(elementType == BSON.BSON_DATA_INT && promoteValues == false) {
object[name] = new Int32(buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24);
} else if(elementType == BSON.BSON_DATA_INT) {
object[name] = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
} else if(elementType == BSON.BSON_DATA_NUMBER && promoteValues == false) {
object[name] = new Double(buffer.readDoubleLE(index));
index = index + 8;
} else if(elementType == BSON.BSON_DATA_NUMBER) {
object[name] = buffer.readDoubleLE(index);
} else if (elementType === BSON.BSON_DATA_INT && promoteValues === false) {
object[name] = new Int32(
buffer[index++] | (buffer[index++] << 8) | (buffer[index++] << 16) | (buffer[index++] << 24)
);
} else if (elementType === BSON.BSON_DATA_INT) {
object[name] =
buffer[index++] |
(buffer[index++] << 8) |
(buffer[index++] << 16) |
(buffer[index++] << 24);
} else if (elementType === BSON.BSON_DATA_NUMBER && promoteValues === false) {
object[name] = new Double(buffer.readDoubleLE(index));
index = index + 8;
} else if(elementType == BSON.BSON_DATA_DATE) {
var lowBits = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
var highBits = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
} else if (elementType === BSON.BSON_DATA_NUMBER) {
object[name] = buffer.readDoubleLE(index);
index = index + 8;
} else if (elementType === BSON.BSON_DATA_DATE) {
var lowBits =
buffer[index++] |
(buffer[index++] << 8) |
(buffer[index++] << 16) |
(buffer[index++] << 24);
var highBits =
buffer[index++] |
(buffer[index++] << 8) |
(buffer[index++] << 16) |
(buffer[index++] << 24);
object[name] = new Date(new Long(lowBits, highBits).toNumber());
} else if(elementType == BSON.BSON_DATA_BOOLEAN) {
if(buffer[index] != 0 && buffer[index] != 1) throw new Error('illegal boolean type value');
object[name] = buffer[index++] == 1;
} else if(elementType == BSON.BSON_DATA_OBJECT) {
var _index = index;
var objectSize = buffer[index] | buffer[index + 1] << 8 | buffer[index + 2] << 16 | buffer[index + 3] << 24;
if(objectSize <= 0 || objectSize > (buffer.length - index)) throw new Error("bad embedded document length in bson");
} else if (elementType === BSON.BSON_DATA_BOOLEAN) {
if (buffer[index] !== 0 && buffer[index] !== 1) throw new Error('illegal boolean type value');
object[name] = buffer[index++] === 1;
} else if (elementType === BSON.BSON_DATA_OBJECT) {
var _index = index;
var objectSize =
buffer[index] |
(buffer[index + 1] << 8) |
(buffer[index + 2] << 16) |
(buffer[index + 3] << 24);
if (objectSize <= 0 || objectSize > buffer.length - index)
throw new Error('bad embedded document length in bson');
// We have a raw value
if(raw) {
object[name] = buffer.slice(index, index + objectSize);
} else {
object[name] = deserializeObject(buffer, _index, options, false);
}
// We have a raw value
if (raw) {
object[name] = buffer.slice(index, index + objectSize);
} else {
object[name] = deserializeObject(buffer, _index, options, false);
}
index = index + objectSize;
} else if(elementType == BSON.BSON_DATA_ARRAY) {
var _index = index;
var objectSize = buffer[index] | buffer[index + 1] << 8 | buffer[index + 2] << 16 | buffer[index + 3] << 24;
var arrayOptions = options;
} else if (elementType === BSON.BSON_DATA_ARRAY) {
_index = index;
objectSize =
buffer[index] |
(buffer[index + 1] << 8) |
(buffer[index + 2] << 16) |
(buffer[index + 3] << 24);
var arrayOptions = options;
// Stop index
var stopIndex = index + objectSize;
// Stop index
var stopIndex = index + objectSize;
// All elements of array to be returned as raw bson
if(fieldsAsRaw && fieldsAsRaw[name]) {
arrayOptions = {};
for(var n in options) arrayOptions[n] = options[n];
arrayOptions['raw'] = true;
}
// All elements of array to be returned as raw bson
if (fieldsAsRaw && fieldsAsRaw[name]) {
arrayOptions = {};
for (var n in options) arrayOptions[n] = options[n];
arrayOptions['raw'] = true;
}
object[name] = deserializeObject(buffer, _index, arrayOptions, true);
index = index + objectSize;
if(buffer[index - 1] != 0) throw new Error('invalid array terminator byte');
if(index != stopIndex) throw new Error('corrupted array bson');
} else if(elementType == BSON.BSON_DATA_UNDEFINED) {
if (buffer[index - 1] !== 0) throw new Error('invalid array terminator byte');
if (index !== stopIndex) throw new Error('corrupted array bson');
} else if (elementType === BSON.BSON_DATA_UNDEFINED) {
object[name] = undefined;
} else if(elementType == BSON.BSON_DATA_NULL) {
object[name] = null;
} else if(elementType == BSON.BSON_DATA_LONG) {
} else if (elementType === BSON.BSON_DATA_NULL) {
object[name] = null;
} else if (elementType === BSON.BSON_DATA_LONG) {
// Unpack the low and high bits
var lowBits = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
var highBits = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
lowBits =
buffer[index++] |
(buffer[index++] << 8) |
(buffer[index++] << 16) |
(buffer[index++] << 24);
highBits =
buffer[index++] |
(buffer[index++] << 8) |
(buffer[index++] << 16) |
(buffer[index++] << 24);
var long = new Long(lowBits, highBits);
// Promote the long if possible
if(promoteLongs && promoteValues == true) {
object[name] = long.lessThanOrEqual(JS_INT_MAX_LONG) && long.greaterThanOrEqual(JS_INT_MIN_LONG) ? long.toNumber() : long;
if (promoteLongs && promoteValues === true) {
object[name] =
long.lessThanOrEqual(JS_INT_MAX_LONG) && long.greaterThanOrEqual(JS_INT_MIN_LONG)
? long.toNumber()
: long;
} else {
object[name] = long;
}
} else if(elementType == BSON.BSON_DATA_DECIMAL128) {
// Buffer to contain the decimal bytes
var bytes = new Buffer(16);
// Copy the next 16 bytes into the bytes buffer
buffer.copy(bytes, 0, index, index + 16);
// Update index
index = index + 16;
// Assign the new Decimal128 value
var decimal128 = new Decimal128(bytes);
// If we have an alternative mapper use that
object[name] = decimal128.toObject ? decimal128.toObject() : decimal128;
} else if(elementType == BSON.BSON_DATA_BINARY) {
var binarySize = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
var totalBinarySize = binarySize;
} else if (elementType === BSON.BSON_DATA_DECIMAL128) {
// Buffer to contain the decimal bytes
var bytes = new Buffer(16);
// Copy the next 16 bytes into the bytes buffer
buffer.copy(bytes, 0, index, index + 16);
// Update index
index = index + 16;
// Assign the new Decimal128 value
var decimal128 = new Decimal128(bytes);
// If we have an alternative mapper use that
object[name] = decimal128.toObject ? decimal128.toObject() : decimal128;
} else if (elementType === BSON.BSON_DATA_BINARY) {
var binarySize =
buffer[index++] |
(buffer[index++] << 8) |
(buffer[index++] << 16) |
(buffer[index++] << 24);
var totalBinarySize = binarySize;
var subType = buffer[index++];
// Did we have a negative binary size, throw
if(binarySize < 0) throw new Error('Negative binary type element size found');
// Did we have a negative binary size, throw
if (binarySize < 0) throw new Error('Negative binary type element size found');
// Is the length longer than the document
if(binarySize > buffer.length) throw new Error('Binary type size larger than document size');
// Is the length longer than the document
if (binarySize > buffer.length) throw new Error('Binary type size larger than document size');
// Decode as raw Buffer object if options specifies it
if(buffer['slice'] != null) {
// Decode as raw Buffer object if options specifies it
if (buffer['slice'] != null) {
// If we have subtype 2 skip the 4 bytes for the size
if(subType == Binary.SUBTYPE_BYTE_ARRAY) {
binarySize = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
if(binarySize < 0) throw new Error('Negative binary type element size found for subtype 0x02');
if(binarySize > (totalBinarySize - 4)) throw new Error('Binary type with subtype 0x02 contains to long binary size');
if(binarySize < (totalBinarySize - 4)) throw new Error('Binary type with subtype 0x02 contains to short binary size');
if (subType === Binary.SUBTYPE_BYTE_ARRAY) {
binarySize =
buffer[index++] |
(buffer[index++] << 8) |
(buffer[index++] << 16) |
(buffer[index++] << 24);
if (binarySize < 0)
throw new Error('Negative binary type element size found for subtype 0x02');
if (binarySize > totalBinarySize - 4)
throw new Error('Binary type with subtype 0x02 contains too long binary size');
if (binarySize < totalBinarySize - 4)
throw new Error('Binary type with subtype 0x02 contains too short binary size');
}
if(promoteBuffers && promoteValues) {
if (promoteBuffers && promoteValues) {
object[name] = buffer.slice(index, index + binarySize);
} else {
object[name] = new Binary(buffer.slice(index, index + binarySize), subType);
}
} else {
var _buffer = typeof Uint8Array != 'undefined' ? new Uint8Array(new ArrayBuffer(binarySize)) : new Array(binarySize);
var _buffer =
typeof Uint8Array !== 'undefined'
? new Uint8Array(new ArrayBuffer(binarySize))
: new Array(binarySize);
// If we have subtype 2 skip the 4 bytes for the size
if(subType == Binary.SUBTYPE_BYTE_ARRAY) {
binarySize = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
if(binarySize < 0) throw new Error('Negative binary type element size found for subtype 0x02');
if(binarySize > (totalBinarySize - 4)) throw new Error('Binary type with subtype 0x02 contains to long binary size');
if(binarySize < (totalBinarySize - 4)) throw new Error('Binary type with subtype 0x02 contains to short binary size');
if (subType === Binary.SUBTYPE_BYTE_ARRAY) {
binarySize =
buffer[index++] |
(buffer[index++] << 8) |
(buffer[index++] << 16) |
(buffer[index++] << 24);
if (binarySize < 0)
throw new Error('Negative binary type element size found for subtype 0x02');
if (binarySize > totalBinarySize - 4)
throw new Error('Binary type with subtype 0x02 contains too long binary size');
if (binarySize < totalBinarySize - 4)
throw new Error('Binary type with subtype 0x02 contains too short binary size');
}
// Copy the data
for(var i = 0; i < binarySize; i++) {
for (i = 0; i < binarySize; i++) {
_buffer[i] = buffer[index + i];
}
if(promoteBuffers && promoteValues) {
if (promoteBuffers && promoteValues) {
object[name] = _buffer;
} else {
object[name] = new Binary(_buffer, subType);
@ -228,38 +300,38 @@ var deserializeObject = function(buffer, index, options, isArray) {
// Update the index
index = index + binarySize;
} else if(elementType == BSON.BSON_DATA_REGEXP && bsonRegExp == false) {
// Get the start search index
var i = index;
// Locate the end of the c string
while(buffer[i] !== 0x00 && i < buffer.length) {
i++
}
// If are at the end of the buffer there is a problem with the document
if(i >= buffer.length) throw new Error("Bad BSON Document: illegal CString")
// Return the C string
var source = buffer.toString('utf8', index, i);
} else if (elementType === BSON.BSON_DATA_REGEXP && bsonRegExp === false) {
// Get the start search index
i = index;
// Locate the end of the c string
while (buffer[i] !== 0x00 && i < buffer.length) {
i++;
}
// If we are at the end of the buffer there is a problem with the document
if (i >= buffer.length) throw new Error('Bad BSON Document: illegal CString');
// Return the C string
var source = buffer.toString('utf8', index, i);
// Create the regexp
index = i + 1;
index = i + 1;
// Get the start search index
var i = index;
// Locate the end of the c string
while(buffer[i] !== 0x00 && i < buffer.length) {
i++
}
// If are at the end of the buffer there is a problem with the document
if(i >= buffer.length) throw new Error("Bad BSON Document: illegal CString")
// Return the C string
var regExpOptions = buffer.toString('utf8', index, i);
index = i + 1;
// Get the start search index
i = index;
// Locate the end of the c string
while (buffer[i] !== 0x00 && i < buffer.length) {
i++;
}
// If we are at the end of the buffer there is a problem with the document
if (i >= buffer.length) throw new Error('Bad BSON Document: illegal CString');
// Return the C string
var regExpOptions = buffer.toString('utf8', index, i);
index = i + 1;
// For each option add the corresponding one for javascript
var optionsArray = new Array(regExpOptions.length);
// Parse options
for(var i = 0; i < regExpOptions.length; i++) {
switch(regExpOptions[i]) {
for (i = 0; i < regExpOptions.length; i++) {
switch (regExpOptions[i]) {
case 'm':
optionsArray[i] = 'm';
break;
@ -273,56 +345,81 @@ var deserializeObject = function(buffer, index, options, isArray) {
}
object[name] = new RegExp(source, optionsArray.join(''));
} else if(elementType == BSON.BSON_DATA_REGEXP && bsonRegExp == true) {
// Get the start search index
var i = index;
// Locate the end of the c string
while(buffer[i] !== 0x00 && i < buffer.length) {
i++
}
// If are at the end of the buffer there is a problem with the document
if(i >= buffer.length) throw new Error("Bad BSON Document: illegal CString")
// Return the C string
var source = buffer.toString('utf8', index, i);
} else if (elementType === BSON.BSON_DATA_REGEXP && bsonRegExp === true) {
// Get the start search index
i = index;
// Locate the end of the c string
while (buffer[i] !== 0x00 && i < buffer.length) {
i++;
}
// If we are at the end of the buffer there is a problem with the document
if (i >= buffer.length) throw new Error('Bad BSON Document: illegal CString');
// Return the C string
source = buffer.toString('utf8', index, i);
index = i + 1;
// Get the start search index
var i = index;
// Locate the end of the c string
while(buffer[i] !== 0x00 && i < buffer.length) {
i++
}
// If are at the end of the buffer there is a problem with the document
if(i >= buffer.length) throw new Error("Bad BSON Document: illegal CString")
// Return the C string
var regExpOptions = buffer.toString('utf8', index, i);
// Get the start search index
i = index;
// Locate the end of the c string
while (buffer[i] !== 0x00 && i < buffer.length) {
i++;
}
// If we are at the end of the buffer there is a problem with the document
if (i >= buffer.length) throw new Error('Bad BSON Document: illegal CString');
// Return the C string
regExpOptions = buffer.toString('utf8', index, i);
index = i + 1;
// Set the object
object[name] = new BSONRegExp(source, regExpOptions);
} else if(elementType == BSON.BSON_DATA_SYMBOL) {
var stringSize = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
if(stringSize <= 0 || stringSize > (buffer.length - index) || buffer[index + stringSize - 1] != 0) throw new Error("bad string length in bson");
} else if (elementType === BSON.BSON_DATA_SYMBOL) {
stringSize =
buffer[index++] |
(buffer[index++] << 8) |
(buffer[index++] << 16) |
(buffer[index++] << 24);
if (
stringSize <= 0 ||
stringSize > buffer.length - index ||
buffer[index + stringSize - 1] !== 0
)
throw new Error('bad string length in bson');
object[name] = new Symbol(buffer.toString('utf8', index, index + stringSize - 1));
index = index + stringSize;
} else if(elementType == BSON.BSON_DATA_TIMESTAMP) {
var lowBits = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
var highBits = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
} else if (elementType === BSON.BSON_DATA_TIMESTAMP) {
lowBits =
buffer[index++] |
(buffer[index++] << 8) |
(buffer[index++] << 16) |
(buffer[index++] << 24);
highBits =
buffer[index++] |
(buffer[index++] << 8) |
(buffer[index++] << 16) |
(buffer[index++] << 24);
object[name] = new Timestamp(lowBits, highBits);
} else if(elementType == BSON.BSON_DATA_MIN_KEY) {
} else if (elementType === BSON.BSON_DATA_MIN_KEY) {
object[name] = new MinKey();
} else if(elementType == BSON.BSON_DATA_MAX_KEY) {
} else if (elementType === BSON.BSON_DATA_MAX_KEY) {
object[name] = new MaxKey();
} else if(elementType == BSON.BSON_DATA_CODE) {
var stringSize = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
if(stringSize <= 0 || stringSize > (buffer.length - index) || buffer[index + stringSize - 1] != 0) throw new Error("bad string length in bson");
} else if (elementType === BSON.BSON_DATA_CODE) {
stringSize =
buffer[index++] |
(buffer[index++] << 8) |
(buffer[index++] << 16) |
(buffer[index++] << 24);
if (
stringSize <= 0 ||
stringSize > buffer.length - index ||
buffer[index + stringSize - 1] !== 0
)
throw new Error('bad string length in bson');
var functionString = buffer.toString('utf8', index, index + stringSize - 1);
// If we are evaluating the functions
if(evalFunctions) {
var value = null;
if (evalFunctions) {
// If we have cache enabled let's look for the md5 of the function in the cache
if(cacheFunctions) {
if (cacheFunctions) {
var hash = cacheFunctionsCrc32 ? crc32(functionString) : functionString;
// Got to do this to avoid V8 deoptimizing the call due to finding eval
object[name] = isolateEvalWithHash(functionCache, hash, functionString, object);
@ -330,54 +427,69 @@ var deserializeObject = function(buffer, index, options, isArray) {
object[name] = isolateEval(functionString);
}
} else {
object[name] = new Code(functionString);
object[name] = new Code(functionString);
}
// Update parse index position
index = index + stringSize;
} else if(elementType == BSON.BSON_DATA_CODE_W_SCOPE) {
var totalSize = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
} else if (elementType === BSON.BSON_DATA_CODE_W_SCOPE) {
var totalSize =
buffer[index++] |
(buffer[index++] << 8) |
(buffer[index++] << 16) |
(buffer[index++] << 24);
// Element cannot be shorter than totalSize + stringSize + documentSize + terminator
if(totalSize < (4 + 4 + 4 + 1)) {
throw new Error("code_w_scope total size shorter minimum expected length");
}
// Element cannot be shorter than totalSize + stringSize + documentSize + terminator
if (totalSize < 4 + 4 + 4 + 1) {
throw new Error('code_w_scope total size shorter than minimum expected length');
}
// Get the code string size
var stringSize = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
// Check if we have a valid string
if(stringSize <= 0 || stringSize > (buffer.length - index) || buffer[index + stringSize - 1] != 0) throw new Error("bad string length in bson");
// Get the code string size
stringSize =
buffer[index++] |
(buffer[index++] << 8) |
(buffer[index++] << 16) |
(buffer[index++] << 24);
// Check if we have a valid string
if (
stringSize <= 0 ||
stringSize > buffer.length - index ||
buffer[index + stringSize - 1] !== 0
)
throw new Error('bad string length in bson');
// Javascript function
var functionString = buffer.toString('utf8', index, index + stringSize - 1);
functionString = buffer.toString('utf8', index, index + stringSize - 1);
// Update parse index position
index = index + stringSize;
// Parse the element
var _index = index;
_index = index;
// Decode the size of the object document
var objectSize = buffer[index] | buffer[index + 1] << 8 | buffer[index + 2] << 16 | buffer[index + 3] << 24;
objectSize =
buffer[index] |
(buffer[index + 1] << 8) |
(buffer[index + 2] << 16) |
(buffer[index + 3] << 24);
// Decode the scope object
var scopeObject = deserializeObject(buffer, _index, options, false);
// Adjust the index
index = index + objectSize;
// Check if field length is to short
if(totalSize < (4 + 4 + objectSize + stringSize)) {
throw new Error('code_w_scope total size is to short, truncating scope');
}
// Check if field length is too short
if (totalSize < 4 + 4 + objectSize + stringSize) {
throw new Error('code_w_scope total size is too short, truncating scope');
}
// Check if totalSize field is to long
if(totalSize > (4 + 4 + objectSize + stringSize)) {
throw new Error('code_w_scope total size is to long, clips outer document');
}
// Check if totalSize field is too long
if (totalSize > 4 + 4 + objectSize + stringSize) {
throw new Error('code_w_scope total size is too long, clips outer document');
}
// If we are evaluating the functions
if(evalFunctions) {
// Contains the value we are going to set
var value = null;
if (evalFunctions) {
// If we have cache enabled let's look for the md5 of the function in the cache
if(cacheFunctions) {
var hash = cacheFunctionsCrc32 ? crc32(functionString) : functionString;
if (cacheFunctions) {
hash = cacheFunctionsCrc32 ? crc32(functionString) : functionString;
// Got to do this to avoid V8 deoptimizing the call due to finding eval
object[name] = isolateEvalWithHash(functionCache, hash, functionString, object);
} else {
@ -386,47 +498,62 @@ var deserializeObject = function(buffer, index, options, isArray) {
object[name].scope = scopeObject;
} else {
object[name] = new Code(functionString, scopeObject);
object[name] = new Code(functionString, scopeObject);
}
} else if(elementType == BSON_DATA_DBPOINTER) {
// Get the code string size
var stringSize = buffer[index++] | buffer[index++] << 8 | buffer[index++] << 16 | buffer[index++] << 24;
// Check if we have a valid string
if(stringSize <= 0 || stringSize > (buffer.length - index) || buffer[index + stringSize - 1] != 0) throw new Error("bad string length in bson");
// Namespace
} else if (elementType === BSON.BSON_DATA_DBPOINTER) {
// Get the code string size
stringSize =
buffer[index++] |
(buffer[index++] << 8) |
(buffer[index++] << 16) |
(buffer[index++] << 24);
// Check if we have a valid string
if (
stringSize <= 0 ||
stringSize > buffer.length - index ||
buffer[index + stringSize - 1] !== 0
)
throw new Error('bad string length in bson');
// Namespace
var namespace = buffer.toString('utf8', index, index + stringSize - 1);
// Update parse index position
// Update parse index position
index = index + stringSize;
// Read the oid
var oidBuffer = new Buffer(12);
buffer.copy(oidBuffer, 0, index, index + 12);
var oid = new ObjectID(oidBuffer);
// Read the oid
var oidBuffer = new Buffer(12);
buffer.copy(oidBuffer, 0, index, index + 12);
oid = new ObjectID(oidBuffer);
// Update the index
index = index + 12;
// Update the index
index = index + 12;
// Split the namespace
var parts = namespace.split('.');
var db = parts.shift();
var collection = parts.join('.');
// Upgrade to DBRef type
object[name] = new DBRef(collection, oid, db);
// Split the namespace
var parts = namespace.split('.');
var db = parts.shift();
var collection = parts.join('.');
// Upgrade to DBRef type
object[name] = new DBRef(collection, oid, db);
} else {
throw new Error("Detected unknown BSON type " + elementType.toString(16) + " for fieldname \"" + name + "\", are you using the latest BSON parser");
}
throw new Error(
'Detected unknown BSON type ' +
elementType.toString(16) +
' for fieldname "' +
name +
'", are you using the latest BSON parser'
);
}
}
// Check if the deserialization was against a valid array/object
if(size != (index - startIndex)) {
if(isArray) throw new Error('corrupt array bson');
throw new Error('corrupt object bson');
}
// Check if the deserialization was against a valid array/object
if (size !== index - startIndex) {
if (isArray) throw new Error('corrupt array bson');
throw new Error('corrupt object bson');
}
// Check if we have a db ref object
if(object['$id'] != null) object = new DBRef(object['$ref'], object['$id'], object['$db']);
if (object['$id'] != null) object = new DBRef(object['$ref'], object['$id'], object['$db']);
return object;
}
};
/**
* Ensure eval is isolated.
@ -439,13 +566,13 @@ var isolateEvalWithHash = function(functionCache, hash, functionString, object)
var value = null;
// Check for cache hit, eval if missing and return cached function
if(functionCache[hash] == null) {
eval("value = " + functionString);
if (functionCache[hash] == null) {
eval('value = ' + functionString);
functionCache[hash] = value;
}
// Set the object
return functionCache[hash].bind(object);
}
};
/**
* Ensure eval is isolated.
@ -457,9 +584,9 @@ var isolateEval = function(functionString) {
// Contains the value we are going to set
var value = null;
// Eval the function
eval("value = " + functionString);
eval('value = ' + functionString);
return value;
}
};
var BSON = {};
@ -469,7 +596,7 @@ var BSON = {};
* @ignore
* @api private
*/
var functionCache = BSON.functionCache = {};
var functionCache = (BSON.functionCache = {});
/**
* Number BSON Type
@ -636,18 +763,18 @@ BSON.BSON_BINARY_SUBTYPE_MD5 = 4;
BSON.BSON_BINARY_SUBTYPE_USER_DEFINED = 128;
// BSON MAX VALUES
BSON.BSON_INT32_MAX = 0x7FFFFFFF;
BSON.BSON_INT32_MAX = 0x7fffffff;
BSON.BSON_INT32_MIN = -0x80000000;
BSON.BSON_INT64_MAX = Math.pow(2, 63) - 1;
BSON.BSON_INT64_MIN = -Math.pow(2, 63);
// JS MAX PRECISE VALUES
BSON.JS_INT_MAX = 0x20000000000000; // Any integer up to 2^53 can be precisely represented by a double.
BSON.JS_INT_MIN = -0x20000000000000; // Any integer down to -2^53 can be precisely represented by a double.
BSON.JS_INT_MAX = 0x20000000000000; // Any integer up to 2^53 can be precisely represented by a double.
BSON.JS_INT_MIN = -0x20000000000000; // Any integer down to -2^53 can be precisely represented by a double.
// Internal long versions
var JS_INT_MAX_LONG = Long.fromNumber(0x20000000000000); // Any integer up to 2^53 can be precisely represented by a double.
var JS_INT_MIN_LONG = Long.fromNumber(-0x20000000000000); // Any integer down to -2^53 can be precisely represented by a double.
var JS_INT_MAX_LONG = Long.fromNumber(0x20000000000000); // Any integer up to 2^53 can be precisely represented by a double.
var JS_INT_MIN_LONG = Long.fromNumber(-0x20000000000000); // Any integer down to -2^53 can be precisely represented by a double.
module.exports = deserialize
module.exports = deserialize;
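The deserializer rewrite above is mostly formatting and strict comparisons, but the options it reads are worth spelling out: `promoteLongs`, `promoteValues` and `promoteBuffers` decide whether wrapper classes (`Long`, `Int32`, `Double`, `Binary`) collapse to plain JavaScript values, `bsonRegExp` returns `BSONRegExp` objects instead of native regular expressions, and `fieldsAsRaw` leaves named array fields as raw BSON buffers. A sketch of the long-promotion behaviour, assuming the `bson@1.0.x` instance API:

```
var BSON = require('bson');
var Long = BSON.Long;
var bson = new BSON();

var bytes = bson.serialize({
  small: Long.fromNumber(42),
  big: Long.fromString('9007199254740993') // 2^53 + 1, not exactly representable as a double
});

// Defaults: longs inside the double-safe range come back as plain numbers
var doc = bson.deserialize(bytes);
console.log(typeof doc.small); // 'number'
console.log(doc.big instanceof Long); // true, too large to promote safely

// Turn promotion off to always get Long wrappers back
var raw = bson.deserialize(bytes, { promoteLongs: false });
console.log(raw.small instanceof Long); // true
```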

File diff suppressed because it is too large


14
node/node_modules/bson/lib/bson/parser/utils.js generated vendored Normal file
View file

@ -0,0 +1,14 @@
'use strict';
/**
* Normalizes our expected stringified form of a function across versions of node
* @param {Function} fn The function to stringify
*/
function normalizedFunctionString(fn) {
return fn.toString().replace(/function *\(/, 'function (');
}
module.exports = {
normalizedFunctionString: normalizedFunctionString
};
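This new helper exists because Node releases disagree on how an anonymous function stringifies (`function (` versus `function(`). A small usage sketch, assuming the helper is required from its published path inside the package:

```
var normalizedFunctionString = require('bson/lib/bson/parser/utils').normalizedFunctionString;

var add = function(a, b) { return a + b; };

// Depending on the Node version, add.toString() begins with either
// "function (a, b)" or "function(a, b)"; the helper rewrites both to the
// canonical "function (" prefix so serialized functions compare stably.
normalizedFunctionString(add); // "function (a, b) { return a + b; }"
```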

View file

@ -5,26 +5,29 @@
* @return {BSONRegExp} A MinKey instance
*/
function BSONRegExp(pattern, options) {
if(!(this instanceof BSONRegExp)) return new BSONRegExp();
if (!(this instanceof BSONRegExp)) return new BSONRegExp();
// Execute
this._bsontype = 'BSONRegExp';
this.pattern = pattern;
this.options = options;
this.pattern = pattern || '';
this.options = options || '';
// Validate options
for(var i = 0; i < options.length; i++) {
if(!(this.options[i] == 'i'
|| this.options[i] == 'm'
|| this.options[i] == 'x'
|| this.options[i] == 'l'
|| this.options[i] == 's'
|| this.options[i] == 'u'
)) {
throw new Error('the regular expression options [' + this.options[i] + "] is not supported");
for (var i = 0; i < this.options.length; i++) {
if (
!(
this.options[i] === 'i' ||
this.options[i] === 'm' ||
this.options[i] === 'x' ||
this.options[i] === 'l' ||
this.options[i] === 's' ||
this.options[i] === 'u'
)
) {
throw new Error('the regular expression options [' + this.options[i] + '] is not supported');
}
}
}
module.exports = BSONRegExp;
module.exports.BSONRegExp = BSONRegExp;
module.exports.BSONRegExp = BSONRegExp;
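A short usage sketch of the constructor above: pattern and options now default to empty strings, and validation runs against `this.options`, so flags supplied through the default are still checked.

```
var BSONRegExp = require('bson').BSONRegExp;

// MongoDB-style server regex; 'i' and 's' are among the accepted flags.
var re = new BSONRegExp('^news', 'is');

// Anything outside i, m, x, l, s, u is rejected by the validation loop.
try {
  new BSONRegExp('^news', 'g');
} catch (err) {
  // err.message: "the regular expression options [g] is not supported"
}
```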

View file

@ -1,3 +1,6 @@
// Custom inspect property name / symbol.
var inspect = Buffer ? require('util').inspect.custom || 'inspect' : 'inspect';
/**
* A class representation of the BSON Symbol type.
*
@ -7,7 +10,7 @@
* @return {Symbol}
*/
function Symbol(value) {
if(!(this instanceof Symbol)) return new Symbol(value);
if (!(this instanceof Symbol)) return new Symbol(value);
this._bsontype = 'Symbol';
this.value = value;
}
@ -27,21 +30,21 @@ Symbol.prototype.valueOf = function() {
*/
Symbol.prototype.toString = function() {
return this.value;
}
};
/**
* @ignore
*/
Symbol.prototype.inspect = function() {
Symbol.prototype[inspect] = function() {
return this.value;
}
};
/**
* @ignore
*/
Symbol.prototype.toJSON = function() {
return this.value;
}
};
module.exports = Symbol;
module.exports.Symbol = Symbol;
module.exports.Symbol = Symbol;

View file

@ -43,20 +43,20 @@
* @param {number} high the high (signed) 32 bits of the Timestamp.
*/
function Timestamp(low, high) {
if(!(this instanceof Timestamp)) return new Timestamp(low, high);
if (!(this instanceof Timestamp)) return new Timestamp(low, high);
this._bsontype = 'Timestamp';
/**
* @type {number}
* @ignore
*/
this.low_ = low | 0; // force into 32 signed bits.
this.low_ = low | 0; // force into 32 signed bits.
/**
* @type {number}
* @ignore
*/
this.high_ = high | 0; // force into 32 signed bits.
};
this.high_ = high | 0; // force into 32 signed bits.
}
/**
* Return the int value.
@ -74,8 +74,7 @@ Timestamp.prototype.toInt = function() {
* @return {number} the closest floating-point representation to this value.
*/
Timestamp.prototype.toNumber = function() {
return this.high_ * Timestamp.TWO_PWR_32_DBL_ +
this.getLowBitsUnsigned();
return this.high_ * Timestamp.TWO_PWR_32_DBL_ + this.getLowBitsUnsigned();
};
/**
@ -86,7 +85,7 @@ Timestamp.prototype.toNumber = function() {
*/
Timestamp.prototype.toJSON = function() {
return this.toString();
}
};
/**
* Return the String value.
@ -122,9 +121,10 @@ Timestamp.prototype.toString = function(opt_radix) {
// minimize the calls to the very expensive emulated div.
var radixToPower = Timestamp.fromNumber(Math.pow(radix, 6));
var rem = this;
rem = this;
var result = '';
while (true) {
while (!rem.isZero()) {
var remDiv = rem.div(radixToPower);
var intval = rem.subtract(remDiv.multiply(radixToPower)).toInt();
var digits = intval.toString(radix);
@ -168,8 +168,7 @@ Timestamp.prototype.getLowBits = function() {
* @return {number} the low 32-bits as an unsigned value.
*/
Timestamp.prototype.getLowBitsUnsigned = function() {
return (this.low_ >= 0) ?
this.low_ : Timestamp.TWO_PWR_32_DBL_ + this.low_;
return this.low_ >= 0 ? this.low_ : Timestamp.TWO_PWR_32_DBL_ + this.low_;
};
/**
@ -186,13 +185,13 @@ Timestamp.prototype.getNumBitsAbs = function() {
return this.negate().getNumBitsAbs();
}
} else {
var val = this.high_ != 0 ? this.high_ : this.low_;
var val = this.high_ !== 0 ? this.high_ : this.low_;
for (var bit = 31; bit > 0; bit--) {
if ((val & (1 << bit)) != 0) {
if ((val & (1 << bit)) !== 0) {
break;
}
}
return this.high_ != 0 ? bit + 33 : bit + 1;
return this.high_ !== 0 ? bit + 33 : bit + 1;
}
};
@ -203,7 +202,7 @@ Timestamp.prototype.getNumBitsAbs = function() {
* @return {boolean} whether this value is zero.
*/
Timestamp.prototype.isZero = function() {
return this.high_ == 0 && this.low_ == 0;
return this.high_ === 0 && this.low_ === 0;
};
/**
@ -223,7 +222,7 @@ Timestamp.prototype.isNegative = function() {
* @return {boolean} whether this value is odd.
*/
Timestamp.prototype.isOdd = function() {
return (this.low_ & 1) == 1;
return (this.low_ & 1) === 1;
};
/**
@ -234,7 +233,7 @@ Timestamp.prototype.isOdd = function() {
* @return {boolean} whether this Timestamp equals the other
*/
Timestamp.prototype.equals = function(other) {
return (this.high_ == other.high_) && (this.low_ == other.low_);
return this.high_ === other.high_ && this.low_ === other.low_;
};
/**
@ -245,7 +244,7 @@ Timestamp.prototype.equals = function(other) {
* @return {boolean} whether this Timestamp does not equal the other.
*/
Timestamp.prototype.notEquals = function(other) {
return (this.high_ != other.high_) || (this.low_ != other.low_);
return this.high_ !== other.high_ || this.low_ !== other.low_;
};
/**
@ -346,27 +345,30 @@ Timestamp.prototype.add = function(other) {
// Divide each number into 4 chunks of 16 bits, and then sum the chunks.
var a48 = this.high_ >>> 16;
var a32 = this.high_ & 0xFFFF;
var a32 = this.high_ & 0xffff;
var a16 = this.low_ >>> 16;
var a00 = this.low_ & 0xFFFF;
var a00 = this.low_ & 0xffff;
var b48 = other.high_ >>> 16;
var b32 = other.high_ & 0xFFFF;
var b32 = other.high_ & 0xffff;
var b16 = other.low_ >>> 16;
var b00 = other.low_ & 0xFFFF;
var b00 = other.low_ & 0xffff;
var c48 = 0, c32 = 0, c16 = 0, c00 = 0;
var c48 = 0,
c32 = 0,
c16 = 0,
c00 = 0;
c00 += a00 + b00;
c16 += c00 >>> 16;
c00 &= 0xFFFF;
c00 &= 0xffff;
c16 += a16 + b16;
c32 += c16 >>> 16;
c16 &= 0xFFFF;
c16 &= 0xffff;
c32 += a32 + b32;
c48 += c32 >>> 16;
c32 &= 0xFFFF;
c32 &= 0xffff;
c48 += a48 + b48;
c48 &= 0xFFFF;
c48 &= 0xffff;
return Timestamp.fromBits((c16 << 16) | c00, (c48 << 16) | c32);
};
@ -405,15 +407,16 @@ Timestamp.prototype.multiply = function(other) {
if (other.isNegative()) {
return this.negate().multiply(other.negate());
} else {
return this.negate().multiply(other).negate();
return this.negate()
.multiply(other)
.negate();
}
} else if (other.isNegative()) {
return this.multiply(other.negate()).negate();
}
// If both Timestamps are small, use float multiplication
if (this.lessThan(Timestamp.TWO_PWR_24_) &&
other.lessThan(Timestamp.TWO_PWR_24_)) {
if (this.lessThan(Timestamp.TWO_PWR_24_) && other.lessThan(Timestamp.TWO_PWR_24_)) {
return Timestamp.fromNumber(this.toNumber() * other.toNumber());
}
@ -421,36 +424,39 @@ Timestamp.prototype.multiply = function(other) {
// We can skip products that would overflow.
var a48 = this.high_ >>> 16;
var a32 = this.high_ & 0xFFFF;
var a32 = this.high_ & 0xffff;
var a16 = this.low_ >>> 16;
var a00 = this.low_ & 0xFFFF;
var a00 = this.low_ & 0xffff;
var b48 = other.high_ >>> 16;
var b32 = other.high_ & 0xFFFF;
var b32 = other.high_ & 0xffff;
var b16 = other.low_ >>> 16;
var b00 = other.low_ & 0xFFFF;
var b00 = other.low_ & 0xffff;
var c48 = 0, c32 = 0, c16 = 0, c00 = 0;
var c48 = 0,
c32 = 0,
c16 = 0,
c00 = 0;
c00 += a00 * b00;
c16 += c00 >>> 16;
c00 &= 0xFFFF;
c00 &= 0xffff;
c16 += a16 * b00;
c32 += c16 >>> 16;
c16 &= 0xFFFF;
c16 &= 0xffff;
c16 += a00 * b16;
c32 += c16 >>> 16;
c16 &= 0xFFFF;
c16 &= 0xffff;
c32 += a32 * b00;
c48 += c32 >>> 16;
c32 &= 0xFFFF;
c32 &= 0xffff;
c32 += a16 * b16;
c48 += c32 >>> 16;
c32 &= 0xFFFF;
c32 &= 0xffff;
c32 += a00 * b32;
c48 += c32 >>> 16;
c32 &= 0xFFFF;
c32 &= 0xffff;
c48 += a48 * b00 + a32 * b16 + a16 * b32 + a00 * b48;
c48 &= 0xFFFF;
c48 &= 0xffff;
return Timestamp.fromBits((c16 << 16) | c00, (c48 << 16) | c32);
};
@ -469,9 +475,8 @@ Timestamp.prototype.div = function(other) {
}
if (this.equals(Timestamp.MIN_VALUE)) {
if (other.equals(Timestamp.ONE) ||
other.equals(Timestamp.NEG_ONE)) {
return Timestamp.MIN_VALUE; // recall that -MIN_VALUE == MIN_VALUE
if (other.equals(Timestamp.ONE) || other.equals(Timestamp.NEG_ONE)) {
return Timestamp.MIN_VALUE; // recall that -MIN_VALUE == MIN_VALUE
} else if (other.equals(Timestamp.MIN_VALUE)) {
return Timestamp.ONE;
} else {
@ -494,7 +499,9 @@ Timestamp.prototype.div = function(other) {
if (other.isNegative()) {
return this.negate().div(other.negate());
} else {
return this.negate().div(other).negate();
return this.negate()
.div(other)
.negate();
}
} else if (other.isNegative()) {
return this.div(other.negate()).negate();
@ -506,16 +513,16 @@ Timestamp.prototype.div = function(other) {
// the approximate value is less than or equal to the real value so that the
// remainder never becomes negative.
var res = Timestamp.ZERO;
var rem = this;
rem = this;
while (rem.greaterThanOrEqual(other)) {
// Approximate the result of division. This may be a little greater or
// smaller than the actual value.
var approx = Math.max(1, Math.floor(rem.toNumber() / other.toNumber()));
approx = Math.max(1, Math.floor(rem.toNumber() / other.toNumber()));
// We will tweak the approximate result by changing it in the 48-th digit or
// the smallest non-fractional digit, whichever is larger.
var log2 = Math.ceil(Math.log(approx) / Math.LN2);
var delta = (log2 <= 48) ? 1 : Math.pow(2, log2 - 48);
var delta = log2 <= 48 ? 1 : Math.pow(2, log2 - 48);
// Decrease the approximation until it is smaller than the remainder. Note
// that if it is too large, the product overflows and is negative.
@ -602,15 +609,13 @@ Timestamp.prototype.xor = function(other) {
*/
Timestamp.prototype.shiftLeft = function(numBits) {
numBits &= 63;
if (numBits == 0) {
if (numBits === 0) {
return this;
} else {
var low = this.low_;
if (numBits < 32) {
var high = this.high_;
return Timestamp.fromBits(
low << numBits,
(high << numBits) | (low >>> (32 - numBits)));
return Timestamp.fromBits(low << numBits, (high << numBits) | (low >>> (32 - numBits)));
} else {
return Timestamp.fromBits(0, low << (numBits - 32));
}
@ -626,19 +631,15 @@ Timestamp.prototype.shiftLeft = function(numBits) {
*/
Timestamp.prototype.shiftRight = function(numBits) {
numBits &= 63;
if (numBits == 0) {
if (numBits === 0) {
return this;
} else {
var high = this.high_;
if (numBits < 32) {
var low = this.low_;
return Timestamp.fromBits(
(low >>> numBits) | (high << (32 - numBits)),
high >> numBits);
return Timestamp.fromBits((low >>> numBits) | (high << (32 - numBits)), high >> numBits);
} else {
return Timestamp.fromBits(
high >> (numBits - 32),
high >= 0 ? 0 : -1);
return Timestamp.fromBits(high >> (numBits - 32), high >= 0 ? 0 : -1);
}
}
};
@ -652,16 +653,14 @@ Timestamp.prototype.shiftRight = function(numBits) {
*/
Timestamp.prototype.shiftRightUnsigned = function(numBits) {
numBits &= 63;
if (numBits == 0) {
if (numBits === 0) {
return this;
} else {
var high = this.high_;
if (numBits < 32) {
var low = this.low_;
return Timestamp.fromBits(
(low >>> numBits) | (high << (32 - numBits)),
high >>> numBits);
} else if (numBits == 32) {
return Timestamp.fromBits((low >>> numBits) | (high << (32 - numBits)), high >>> numBits);
} else if (numBits === 32) {
return Timestamp.fromBits(high, 0);
} else {
return Timestamp.fromBits(high >>> (numBits - 32), 0);
@ -709,8 +708,9 @@ Timestamp.fromNumber = function(value) {
return Timestamp.fromNumber(-value).negate();
} else {
return new Timestamp(
(value % Timestamp.TWO_PWR_32_DBL_) | 0,
(value / Timestamp.TWO_PWR_32_DBL_) | 0);
(value % Timestamp.TWO_PWR_32_DBL_) | 0,
(value / Timestamp.TWO_PWR_32_DBL_) | 0
);
}
};
@ -735,7 +735,7 @@ Timestamp.fromBits = function(lowBits, highBits) {
* @return {Timestamp} the corresponding Timestamp value.
*/
Timestamp.fromString = function(str, opt_radix) {
if (str.length == 0) {
if (str.length === 0) {
throw Error('number format error: empty string');
}
@ -744,7 +744,7 @@ Timestamp.fromString = function(str, opt_radix) {
throw Error('radix out of range: ' + radix);
}
if (str.charAt(0) == '-') {
if (str.charAt(0) === '-') {
return Timestamp.fromString(str.substring(1), radix).negate();
} else if (str.indexOf('-') >= 0) {
throw Error('number format error: interior "-" character: ' + str);
@ -772,7 +772,6 @@ Timestamp.fromString = function(str, opt_radix) {
// NOTE: Common constant values ZERO, ONE, NEG_ONE, etc. are defined below the
// from* methods on which they depend.
/**
* A cache of the Timestamp representations of small integer values.
* @type {Object}
@ -837,8 +836,7 @@ Timestamp.ONE = Timestamp.fromInt(1);
Timestamp.NEG_ONE = Timestamp.fromInt(-1);
/** @type {Timestamp} */
Timestamp.MAX_VALUE =
Timestamp.fromBits(0xFFFFFFFF | 0, 0x7FFFFFFF | 0);
Timestamp.MAX_VALUE = Timestamp.fromBits(0xffffffff | 0, 0x7fffffff | 0);
/** @type {Timestamp} */
Timestamp.MIN_VALUE = Timestamp.fromBits(0, 0x80000000 | 0);
@ -853,4 +851,4 @@ Timestamp.TWO_PWR_24_ = Timestamp.fromInt(1 << 24);
* Expose.
*/
module.exports = Timestamp;
module.exports.Timestamp = Timestamp;
module.exports.Timestamp = Timestamp;
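For orientation, a small sketch of the public surface this file provides:

```
var Timestamp = require('bson').Timestamp;

// Low 32 bits first, high 32 bits second; for oplog entries that is
// (increment, seconds-since-epoch).
var ts = new Timestamp(1, 1555182767);

ts.getLowBits();   // 1
ts.getHighBits();  // 1555182767
ts.toString();     // decimal string of the full 64-bit value
ts.equals(Timestamp.fromBits(1, 1555182767)); // true
```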

97
node/node_modules/bson/package.json generated vendored
View file

@ -1,49 +1,26 @@
{
"_args": [
[
{
"raw": "bson@~0.5.7",
"scope": null,
"escapedName": "bson",
"name": "bson",
"rawSpec": "~0.5.7",
"spec": ">=0.5.7 <0.6.0",
"type": "range"
},
"/Users/sclay/projects/newsblur/node/node_modules/mongodb-core"
]
],
"_from": "bson@>=0.5.7 <0.6.0",
"_id": "bson@0.5.7",
"_inCache": true,
"_from": "bson@~1.0.4",
"_id": "bson@1.0.9",
"_inBundle": false,
"_integrity": "sha512-IQX9/h7WdMBIW/q/++tGd+emQr0XMdeZ6icnT/74Xk9fnabWn+gZgpE+9V+gujL3hhJOoNrnDVY7tWdzc7NUTg==",
"_location": "/bson",
"_nodeVersion": "6.9.1",
"_npmOperationalInternal": {
"host": "packages-18-east.internal.npmjs.com",
"tmp": "tmp/bson-0.5.7.tgz_1479477418380_0.5046395626850426"
},
"_npmUser": {
"name": "christkv",
"email": "christkv@gmail.com"
},
"_npmVersion": "3.10.8",
"_phantomChildren": {},
"_requested": {
"raw": "bson@~0.5.7",
"scope": null,
"escapedName": "bson",
"type": "range",
"registry": true,
"raw": "bson@~1.0.4",
"name": "bson",
"rawSpec": "~0.5.7",
"spec": ">=0.5.7 <0.6.0",
"type": "range"
"escapedName": "bson",
"rawSpec": "~1.0.4",
"saveSpec": null,
"fetchSpec": "~1.0.4"
},
"_requiredBy": [
"/mongodb-core"
],
"_resolved": "https://registry.npmjs.org/bson/-/bson-0.5.7.tgz",
"_shasum": "0d11fe0936c1fee029e11f7063f5d0ab2422ea3e",
"_shrinkwrap": null,
"_spec": "bson@~0.5.7",
"_resolved": "https://registry.npmjs.org/bson/-/bson-1.0.9.tgz",
"_shasum": "12319f8323b1254739b7c6bef8d3e89ae05a2f57",
"_spec": "bson@~1.0.4",
"_where": "/Users/sclay/projects/newsblur/node/node_modules/mongodb-core",
"author": {
"name": "Christian Amor Kvalheim",
@ -53,37 +30,39 @@
"bugs": {
"url": "https://github.com/mongodb/js-bson/issues"
},
"bundleDependencies": false,
"config": {
"native": false
},
"contributors": [],
"dependencies": {},
"deprecated": false,
"description": "A bson parser for node.js and the browser",
"devDependencies": {
"babel-core": "^6.14.0",
"babel-loader": "^6.2.5",
"babel-polyfill": "^6.13.0",
"babel-preset-es2015": "^6.14.0",
"babel-preset-stage-0": "^6.5.0",
"babel-register": "^6.14.0",
"benchmark": "1.0.0",
"colors": "1.1.0",
"gleak": "0.2.3",
"nodeunit": "0.9.0"
"conventional-changelog-cli": "^1.3.5",
"nodeunit": "0.9.0",
"webpack": "^1.13.2",
"webpack-polyfills-plugin": "0.0.9"
},
"directories": {
"lib": "./lib/bson"
},
"dist": {
"shasum": "0d11fe0936c1fee029e11f7063f5d0ab2422ea3e",
"tarball": "https://registry.npmjs.org/bson/-/bson-0.5.7.tgz"
},
"engines": {
"node": ">=0.6.19"
},
"files": [
"lib",
"index.js",
"browser_build",
"alternate_parsers",
"bower.json",
"tools",
"deserializer_bak.js"
"bower.json"
],
"gitHead": "c3bc67c4e89e795beca2d6d309e7a840b740574d",
"homepage": "https://github.com/mongodb/js-bson#readme",
"keywords": [
"mongodb",
@ -91,26 +70,18 @@
"parser"
],
"license": "Apache-2.0",
"main": "./lib/bson/index",
"maintainers": [
{
"name": "octave",
"email": "chinsay@gmail.com"
},
{
"name": "christkv",
"email": "christkv@gmail.com"
}
],
"main": "./index",
"name": "bson",
"optionalDependencies": {},
"readme": "ERROR: No README data found!",
"repository": {
"type": "git",
"url": "git+https://github.com/mongodb/js-bson.git"
},
"scripts": {
"build": "webpack --config ./webpack.dist.config.js",
"changelog": "conventional-changelog -p angular -i HISTORY.md -s",
"format": "prettier --print-width 100 --tab-width 2 --single-quote --write 'test/**/*.js' 'lib/**/*.js'",
"lint": "eslint lib test",
"test": "nodeunit ./test/node"
},
"version": "0.5.7"
"version": "1.0.9"
}

View file

@ -1,21 +0,0 @@
var gleak = require('gleak')();
gleak.ignore('AssertionError');
gleak.ignore('testFullSpec_param_found');
gleak.ignore('events');
gleak.ignore('Uint8Array');
gleak.ignore('Uint8ClampedArray');
gleak.ignore('TAP_Global_Harness');
gleak.ignore('setImmediate');
gleak.ignore('clearImmediate');
gleak.ignore('DTRACE_NET_SERVER_CONNECTION');
gleak.ignore('DTRACE_NET_STREAM_END');
gleak.ignore('DTRACE_NET_SOCKET_READ');
gleak.ignore('DTRACE_NET_SOCKET_WRITE');
gleak.ignore('DTRACE_HTTP_SERVER_REQUEST');
gleak.ignore('DTRACE_HTTP_SERVER_RESPONSE');
gleak.ignore('DTRACE_HTTP_CLIENT_REQUEST');
gleak.ignore('DTRACE_HTTP_CLIENT_RESPONSE');
module.exports = gleak;

View file

@ -1,3 +1,148 @@
<a name="2.1.19"></a>
## [2.1.19](https://github.com/christkv/mongodb-core/compare/v2.1.18...v2.1.19) (2018-02-26)
### Bug Fixes
* **replset:** remove primary from topology on failure to connect ([2a246b5](https://github.com/christkv/mongodb-core/commit/2a246b5))
<a name="2.1.18"></a>
## [2.1.18](https://github.com/christkv/mongodb-core/compare/v2.1.17...v2.1.18) (2018-01-02)
### Bug Fixes
* **auth:** don't redeclare BSON variable in plain auth ([4fc77e3](https://github.com/christkv/mongodb-core/commit/4fc77e3))
* **auth:** remove extra bson include ([4411d2c](https://github.com/christkv/mongodb-core/commit/4411d2c))
* **connection:** accept incoming missed ssl options from mongo-native-driver ([fd543eb](https://github.com/christkv/mongodb-core/commit/fd543eb))
* **connection:** added missed tls option ecdhCurve ([ca1d909](https://github.com/christkv/mongodb-core/commit/ca1d909))
* **connection:** default `family` to undefined rather than 4 ([09916ae](https://github.com/christkv/mongodb-core/commit/09916ae))
* **connection:** fixing leaky connection ([#234](https://github.com/christkv/mongodb-core/issues/234)) ([7633f10](https://github.com/christkv/mongodb-core/commit/7633f10))
* **cursor:** check for autoReconnect option only for single server ([c841eb5](https://github.com/christkv/mongodb-core/commit/c841eb5))
* **secondaries:** fixes connection with secondary readPreference ([5763f5c](https://github.com/christkv/mongodb-core/commit/5763f5c))
* **timeout:** fixed compatibility with node <=0.12.x ([c7c72b2](https://github.com/christkv/mongodb-core/commit/c7c72b2)), closes [mongodb-js/mongodb-core#247](https://github.com/mongodb-js/mongodb-core/issues/247) [mongodb-js/mongodb-core#247](https://github.com/mongodb-js/mongodb-core/issues/247) [mongodb-js/mongodb-core#248](https://github.com/mongodb-js/mongodb-core/issues/248) [#248](https://github.com/christkv/mongodb-core/issues/248)
### Features
* **replicaset:** More verbose replica set errors emission: ([#231](https://github.com/christkv/mongodb-core/issues/231)) ([de6d220](https://github.com/christkv/mongodb-core/commit/de6d220))
2.1.17 2017-10-11
-----------------
* fix a typo that completely broke SCRAM-SHA1 authentication
2.1.16 2017-10-11
-----------------
* avoid waiting for reconnect if reconnect disabled in Server topology
* avoid waiting for reconnect if reconnect disabled in Cursor
* NODE-990 cache the ScramSHA1 salted passwords up to 200 entries
* NODE-1153 ensure that errors are propagated on force destroy
* NODE-1153 ensure inUse and connecting queues are cleared on reauth
2.1.15 2017-08-08
-----------------
* Emit SDAM events on close and reconnect
2.1.14 2017-07-07
-----------------
* NODE-1073 updates scram.js hi() algorithm to utilize crypto.pbkdf2Sync()
* NODE-1049 only include primary server if there are no secondary servers for
readPreference secondaryPreferred
* moved `assign` polyfill to shared utils, replace usage of `extend` in tests
2.1.13 2017-06-19
-----------------
* NODE-1039 ensure we force destroy server instances, forcing queue to be flushed.
* Use actual server type in standalone SDAM events.
2.1.12 2017-06-02
-----------------
* NODE-1019 Set keepAlive to 300 seconds or 1/2 of socketTimeout if socketTimeout < keepAlive.
* Minor fix to report the correct state on error.
* NODE-1020 'family' was added to options to provide high priority for ipv6 addresses (Issue #1518, https://github.com/firej).
* Fix require_optional loading of bson-ext.
* Ensure no errors are thrown by replset if topology is destroyed before it finished connecting.
* NODE-999 SDAM fixes for Mongos and single Server event emitting.
* NODE-1014 Set socketTimeout to default to 360 seconds.
* NODE-1019 Set keepAlive to 300 seconds or 1/2 of socketTimeout if socketTimeout < keepAlive.
2.1.11 2017-05-22
-----------------
* NODE-987 Clear out old intervalIds when calling topologyMonitor.
* NODE-987 Moved filtering to pingServer method and added test case.
* Check for connection destroyed just before writing out and flush out operations correctly if it is (Issue #179, https://github.com/jmholzinger).
* NODE-989 Refactored Replicaset monitoring to correctly monitor newly added servers. Also extracted setTimeout and setInterval to use custom wrappers Timeout and Interval.
2.1.10 2017-04-18
-----------------
* NODE-981 delegate auth to replset/mongos if inTopology is set.
* NODE-978 Wrap connection.end in try/catch for a node 0.10.x issue causing exceptions to be thrown. Also surfaced getConnection for mongos and replset.
* Remove dynamic require (Issue #175, https://github.com/tellnes).
* NODE-696 Handle interrupted error for createIndexes.
* Fixed issue when a user is executing a find command using Server.command and it gets interpreted as a wire protocol message, #172.
* NODE-966 promoteValues not being promoted correctly to getMore.
* Merged in fix for flushing out monitoring operations.
2.1.9 2017-03-17
----------------
* Return lastIsMaster correctly when connecting with secondaryOnlyConnectionAllowed set to true and only a secondary is available in the replica set state.
* Clone options when passed to wireProtocol handler to avoid intermittent modifications causing errors.
* Ensure SSL errors propagate better for Replset connections when there is an SSL validation error.
* NODE-957 Fixed issue where < batchSize not causing cursor to be closed on execution of first batch.
* NODE-958 Store reconnectConnection on pool object to allow destroy to close immediately.
2.1.8 2017-02-13
----------------
* NODE-925 ensure we reschedule operations while the pool is < poolSize and still growing and there are no connections not currently performing work.
* NODE-927 fixes issue where authentication was performed against arbiter instances.
* NODE-915 Normalize all host names to avoid comparison issues.
* Fixed issue where pool.destroy would never finish due to a single operation not being executed and keeping it open.
2.1.7 2017-01-24
----------------
* NODE-919 ReplicaSet connection does not close immediately (Issue #156).
* NODE-901 Fixed bug when normalizing host names.
* NODE-909 Fixed readPreference issue caused by direct connection to primary.
* NODE-910 Fixed issue when bufferMaxEntries == 0 and read preference set to nearest.
2.1.6 2017-01-13
----------------
* NODE-908 Keep auth contexts in replset and mongos topology to ensure correct application of authentication credentials when primary is first server to be detected causing an immediate connect event to happen.
2.1.5 2017-01-11
----------------
* updated bson and bson-ext dependencies to 1.0.4 to work past early node 4.x.x version having a broken Buffer.from implementation.
2.1.4 2017-01-03
----------------
* updated bson and bson-ext dependencies to 1.0.3 due to util.inspect issue with ObjectId optimizations.
2.1.3 2017-01-03
----------------
* Monitoring operations are re-scheduled in the pool if it cannot find a connection that does not already have scheduled work on it; this is to avoid the monitoring socket timeout being applied to any existing operations on the socket due to pipelining.
* Moved replicaset monitoring away from serial mode and to parallel mode.
* updated bson and bson-ext dependencies to 1.0.2.
2.1.2 2016-12-10
----------------
* Delay topologyMonitoring on successful attemptReconnect, as there is no need to run a full scan immediately.
* Emit reconnect event in primary joining when in connected status for a replicaset.
2.1.1 2016-12-08
----------------
* Updated bson library to 1.0.1.
* Added optional support for bson-ext 1.0.1.
2.1.0 2016-12-05
----------------
* Updated bson library to 1.0.0.
* Added optional support for bson-ext 1.0.0.
* Expose property parserType allowing for identification of currently configured parser.
2.0.14 2016-11-29
-----------------
* Updated bson library to 0.5.7.

View file

View file

@ -1,72 +0,0 @@
var Server = require('./lib/topologies/server');
// Attempt to connect
var server = new Server({
host: 'localhost', port: 27017, socketTimeout: 500
});
// function executeCursors(_server, cb) {
// var count = 100;
//
// for(var i = 0; i < 100; i++) {
// // Execute the write
// var cursor = _server.cursor('test.test', {
// find: 'test.test'
// , query: {a:1}
// }, {readPreference: new ReadPreference('secondary')});
//
// // Get the first document
// cursor.next(function(err, doc) {
// count = count - 1;
// if(err) console.dir(err)
// if(count == 0) return cb();
// });
// }
// }
server.on('connect', function(_server) {
// console.log("===== connect")
setInterval(function() {
_server.insert('test.test', [{a:1}], function(err, r) {
console.log("insert")
});
}, 1000)
// console.log("---------------------------------- 0")
// // Attempt authentication
// _server.auth('scram-sha-1', 'admin', 'root', 'root', function(err, r) {
// console.log("---------------------------------- 1")
// // console.dir(err)
// // console.dir(r)
//
// _server.insert('test.test', [{a:1}], function(err, r) {
// console.log("---------------------------------- 2")
// console.dir(err)
// if(r)console.dir(r.result)
// var name = null;
//
// _server.on('joined', function(_t, _server) {
// if(name == _server.name) {
// console.log("=========== joined :: " + _t + " :: " + _server.name)
// executeCursors(_server, function() {
// });
// }
// })
//
// // var s = _server.s.replicaSetState.secondaries[0];
// // s.destroy({emitClose:true});
// executeCursors(_server, function() {
// console.log("============== 0")
// // Attempt to force a server reconnect
// var s = _server.s.replicaSetState.secondaries[0];
// name = s.name;
// s.destroy({emitClose:true});
// // console.log("============== 1")
//
// // _server.destroy();
// // test.done();
// });
// });
// });
});
server.connect();

View file

@ -1,21 +1,15 @@
// module.exports = {
// MongoError: require('./lib/error')
// , Server: require('./lib/topologies/server')
// , ReplSet: require('./lib/topologies/replset')
// , Mongos: require('./lib/topologies/mongos')
// , Logger: require('./lib/connection/logger')
// , Cursor: require('./lib/cursor')
// , ReadPreference: require('./lib/topologies/read_preference')
// , BSON: require('bson')
// // Raw operations
// , Query: require('./lib/connection/commands').Query
// // Auth mechanisms
// , MongoCR: require('./lib/auth/mongocr')
// , X509: require('./lib/auth/x509')
// , Plain: require('./lib/auth/plain')
// , GSSAPI: require('./lib/auth/gssapi')
// , ScramSHA1: require('./lib/auth/scram')
// }
var BSON = require('bson');
var require_optional = require('require_optional');
try {
// Attempt to grab the native BSON parser
var BSONNative = require_optional('bson-ext');
// If we got the native parser, use it instead of the
// Javascript one
if(BSONNative) {
BSON = BSONNative
}
} catch(err) {}
module.exports = {
MongoError: require('./lib/error')
@ -26,7 +20,7 @@ module.exports = {
, Logger: require('./lib/connection/logger')
, Cursor: require('./lib/cursor')
, ReadPreference: require('./lib/topologies/read_preference')
, BSON: require('bson')
, BSON: BSON
// Raw operations
, Query: require('./lib/connection/commands').Query
// Auth mechanisms

View file

@ -1,10 +1,13 @@
"use strict";
var f = require('util').format
, Binary = require('bson').Binary
, retrieveBSON = require('../connection/utils').retrieveBSON
, Query = require('../connection/commands').Query
, MongoError = require('../error');
var BSON = retrieveBSON()
, Binary = BSON.Binary;
var AuthSession = function(db, username, password) {
this.db = db;
this.username = username;

View file

@ -2,10 +2,13 @@
var f = require('util').format
, crypto = require('crypto')
, retrieveBSON = require('../connection/utils').retrieveBSON
, Query = require('../connection/commands').Query
, Binary = require('bson').Binary
, MongoError = require('../error');
var BSON = retrieveBSON(),
Binary = BSON.Binary;
var AuthSession = function(db, username, password) {
this.db = db;
this.username = username;
@ -71,27 +74,32 @@ var xor = function(a, b) {
return new Buffer(res);
}
// Create a final digest
var _hiCache = {};
var _hiCacheCount = 0;
var _hiCachePurge = function() {
_hiCache = {};
_hiCacheCount = 0;
};
var hi = function(data, salt, iterations) {
// Create digest
var digest = function(msg) {
var hmac = crypto.createHmac('sha1', data);
hmac.update(msg);
return new Buffer(hmac.digest('base64'), 'base64');
// omit the work if already generated
var key = [data, salt.toString('base64'), iterations].join('_');
if (_hiCache[key] !== undefined) {
return _hiCache[key];
}
// Create variables
salt = Buffer.concat([salt, new Buffer('\x00\x00\x00\x01')])
var ui = digest(salt);
var u1 = ui;
// generate the salt
var saltedData = crypto.pbkdf2Sync(data, salt, iterations, 20, "sha1");
for(var i = 0; i < iterations - 1; i++) {
u1 = digest(u1);
ui = xor(ui, u1);
// cache a copy to speed up the next lookup, but prevent unbounded cache growth
if (_hiCacheCount >= 200) {
_hiCachePurge();
}
return ui;
}
_hiCache[key] = saltedData;
_hiCacheCount += 1;
return saltedData;
};
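The rewritten `hi()` above replaces the hand-rolled HMAC loop with `crypto.pbkdf2Sync` and memoizes the salted password, purging the cache once it holds 200 entries (NODE-990). An illustrative standalone sketch of that pattern, not the library code verbatim:

```
var crypto = require('crypto');

var _cache = {};
var _cacheCount = 0;

function saltPassword(password, salt, iterations) {
  // Key the cache on every input that affects the derived value.
  var key = [password, salt.toString('base64'), iterations].join('_');
  if (_cache[key] !== undefined) return _cache[key];

  var salted = crypto.pbkdf2Sync(password, salt, iterations, 20, 'sha1');

  // Crude bound on memory: once the cache is full, drop everything and start over.
  if (_cacheCount >= 200) {
    _cache = {};
    _cacheCount = 0;
  }
  _cache[key] = salted;
  _cacheCount += 1;
  return salted;
}
```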
/**
* Authenticate
@ -230,15 +238,6 @@ ScramSHA1.prototype.auth = function(server, connections, db, username, password,
// Create client final
var clientFinal = [withoutProof, clientProof].join(',');
// Generate server key
hmac = crypto.createHmac('sha1', saltedPassword);
hmac.update(new Buffer('Server Key'))
var serverKey = new Buffer(hmac.digest('base64'), 'base64');
// Generate server signature
hmac = crypto.createHmac('sha1', serverKey);
hmac.update(new Buffer(authMsg))
//
// Create continue message
var cmd = {

View file

@ -1,6 +1,8 @@
"use strict";
var Long = require('bson').Long;
var retrieveBSON = require('../connection/utils').retrieveBSON;
var BSON = retrieveBSON();
var Long = BSON.Long;
// Incrementing request id
var _requestId = 0;
@ -34,7 +36,7 @@ var Query = function(bson, ns, query, options) {
if(ns == null) throw new Error("ns must be specified for query");
if(query == null) throw new Error("query must be specified for query");
// Validate that we are not passing 0x00 in the colletion name
// Validate that we are not passing 0x00 in the collection name
if(!!~ns.indexOf("\x00")) {
throw new Error("namespace cannot contain a null character");
}
@ -135,18 +137,22 @@ Query.prototype.toBin = function() {
buffers.push(header);
// Serialize the query
var query = self.bson.serialize(this.query
, this.checkKeys
, true
, this.serializeFunctions
, 0, this.ignoreUndefined);
var query = self.bson.serialize(this.query, {
checkKeys: this.checkKeys,
serializeFunctions: this.serializeFunctions,
ignoreUndefined: this.ignoreUndefined,
});
// Add query document
buffers.push(query);
if(self.returnFieldSelector && Object.keys(self.returnFieldSelector).length > 0) {
// Serialize the projection document
projection = self.bson.serialize(this.returnFieldSelector, this.checkKeys, true, this.serializeFunctions, this.ignoreUndefined);
projection = self.bson.serialize(this.returnFieldSelector, {
checkKeys: this.checkKeys,
serializeFunctions: this.serializeFunctions,
ignoreUndefined: this.ignoreUndefined,
});
// Add projection document
buffers.push(projection);
}
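The serializer is now called with the bson 1.x options-object signature instead of positional flags. Roughly equivalent standalone usage, for reference:

```
var BSON = require('bson');
var bson = new BSON();

var bytes = bson.serialize({ hello: 'world' }, {
  checkKeys: true,            // reject keys containing '.' or starting with '$'
  serializeFunctions: false,  // do not serialize plain Function values found in the document
  ignoreUndefined: true       // drop fields whose value is undefined
});
```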
@ -472,6 +478,13 @@ Response.prototype.parse = function(options) {
: this.opts.promoteBuffers
var bsonSize, _options;
// Set up the options
_options = {
promoteLongs: promoteLongs,
promoteValues: promoteValues,
promoteBuffers: promoteBuffers
};
//
// Single document and documentsReturnedIn set
//
@ -483,13 +496,7 @@ Response.prototype.parse = function(options) {
// Set up field we wish to keep as raw
var fieldsAsRaw = {}
fieldsAsRaw[documentsReturnedIn] = true;
// Set up the options
_options = {
promoteLongs: promoteLongs,
promoteValues: promoteValues,
promoteBuffers: promoteBuffers,
fieldsAsRaw: fieldsAsRaw
};
_options.fieldsAsRaw = fieldsAsRaw;
// Deserialize but keep the array of documents in non-parsed form
var doc = this.bson.deserialize(document, _options);
@ -515,8 +522,6 @@ Response.prototype.parse = function(options) {
//
for(var i = 0; i < this.numberReturned; i++) {
bsonSize = this.data[this.index] | this.data[this.index + 1] << 8 | this.data[this.index + 2] << 16 | this.data[this.index + 3] << 24;
// Parse options
_options = {promoteLongs: promoteLongs, promoteValues: promoteValues, promoteBuffers: promoteBuffers};
// If we have raw results specified slice the return document
if(raw) {

View file

@ -13,7 +13,7 @@ var inherits = require('util').inherits
var _id = 0;
var debugFields = ['host', 'port', 'size', 'keepAlive', 'keepAliveInitialDelay', 'noDelay'
, 'connectionTimeout', 'socketTimeout', 'singleBufferSerializtion', 'ssl', 'ca', 'cert'
, 'connectionTimeout', 'socketTimeout', 'singleBufferSerializtion', 'ssl', 'ca', 'crl', 'cert'
, 'rejectUnauthorized', 'promoteLongs', 'promoteValues', 'promoteBuffers', 'checkServerIdentity'];
var connectionAccounting = false;
var connections = {};
@ -23,15 +23,17 @@ var connections = {};
* @class
* @param {string} options.host The server host
* @param {number} options.port The server port
* @param {number} [options.family=null] IP version for DNS lookup, passed down to Node's [`dns.lookup()` function](https://nodejs.org/api/dns.html#dns_dns_lookup_hostname_options_callback). If set to `6`, will only look for ipv6 addresses.
* @param {boolean} [options.keepAlive=true] TCP Connection keep alive enabled
* @param {number} [options.keepAliveInitialDelay=0] Initial delay before TCP keep alive enabled
* @param {number} [options.keepAliveInitialDelay=300000] Initial delay before TCP keep alive enabled
* @param {boolean} [options.noDelay=true] TCP Connection no delay
* @param {number} [options.connectionTimeout=0] TCP Connection timeout setting
* @param {number} [options.socketTimeout=0] TCP Socket timeout setting
* @param {number} [options.connectionTimeout=30000] TCP Connection timeout setting
* @param {number} [options.socketTimeout=360000] TCP Socket timeout setting
* @param {boolean} [options.singleBufferSerializtion=true] Serialize into single buffer, trade of peak memory for serialization speed
* @param {boolean} [options.ssl=false] Use SSL for connection
* @param {boolean|function} [options.checkServerIdentity=true] Ensure we check server identify during SSL, set to false to disable checking. Only works for Node 0.12.x or higher. You can pass in a boolean or your own checkServerIdentity override function.
* @param {Buffer} [options.ca] SSL Certificate store binary buffer
* @param {Buffer} [options.crl] SSL Certificate revocation store binary buffer
* @param {Buffer} [options.cert] SSL Certificate binary buffer
* @param {Buffer} [options.key] SSL Key file binary buffer
* @param {string} [options.passphrase] SSL Certificate pass phrase
@ -72,11 +74,20 @@ var Connection = function(messageHandler, options) {
// Default options
this.port = options.port || 27017;
this.host = options.host || 'localhost';
this.family = typeof options.family == 'number' ? options.family : void 0;
this.keepAlive = typeof options.keepAlive == 'boolean' ? options.keepAlive : true;
this.keepAliveInitialDelay = options.keepAliveInitialDelay || 0;
this.keepAliveInitialDelay = typeof options.keepAliveInitialDelay == 'number'
? options.keepAliveInitialDelay : 300000;
this.noDelay = typeof options.noDelay == 'boolean' ? options.noDelay : true;
this.connectionTimeout = options.connectionTimeout || 0;
this.socketTimeout = options.socketTimeout || 0;
this.connectionTimeout = typeof options.connectionTimeout == 'number'
? options.connectionTimeout : 30000;
this.socketTimeout = typeof options.socketTimeout == 'number'
? options.socketTimeout : 360000;
// Is the keepAliveInitialDelay > socketTimeout set it to half of socketTimeout
if(this.keepAliveInitialDelay > this.socketTimeout) {
this.keepAliveInitialDelay = Math.round(this.socketTimeout/2);
}
// If connection was destroyed
this.destroyed = false;
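Worked example of the new defaults and the clamp above: `socketTimeout` now defaults to 360000 ms and `keepAliveInitialDelay` to 300000 ms, and the delay is halved whenever it would exceed the socket timeout.

```
// Everything except socketTimeout uses the new defaults here.
var socketTimeout = 120000;         // user-supplied, lower than the 360000 default
var keepAliveInitialDelay = 300000; // default

if (keepAliveInitialDelay > socketTimeout) {
  keepAliveInitialDelay = Math.round(socketTimeout / 2); // -> 60000
}
```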
@ -90,9 +101,12 @@ var Connection = function(messageHandler, options) {
// SSL options
this.ca = options.ca || null;
this.crl = options.crl || null;
this.cert = options.cert || null;
this.key = options.key || null;
this.passphrase = options.passphrase || null;
this.ciphers = options.ciphers || null;
this.ecdhCurve = options.ecdhCurve || null;
this.ssl = typeof options.ssl == 'boolean' ? options.ssl : false;
this.rejectUnauthorized = typeof options.rejectUnauthorized == 'boolean' ? options.rejectUnauthorized : true;
this.checkServerIdentity = typeof options.checkServerIdentity == 'boolean'
@ -155,12 +169,12 @@ Connection.connections = function() {
}
function deleteConnection(id) {
// console.log("=== delete connection :: " + id)
// console.log("=== deleted connection " + id + " :: " + (connections[id] ? connections[id].port : ''))
delete connections[id];
}
function addConnection(id, connection) {
// console.log("=== add connection :: " + id)
// console.log("=== added connection " + id + " :: " + connection.port)
connections[id] = connection;
}
@ -364,7 +378,7 @@ var dataHandler = function(self) {
// List of socket level valid ssl options
var legalSslSocketOptions = ['pfx', 'key', 'passphrase', 'cert', 'ca', 'ciphers'
, 'NPNProtocols', 'ALPNProtocols', 'servername'
, 'NPNProtocols', 'ALPNProtocols', 'servername', 'ecdhCurve'
, 'secureProtocol', 'secureContext', 'session'
, 'minDHSize'];
@ -394,9 +408,16 @@ Connection.prototype.connect = function(_options) {
}
// Create new connection instance
self.connection = self.domainSocket
? net.createConnection(self.host)
: net.createConnection(self.port, self.host);
var connection_options;
if (self.domainSocket) {
connection_options = {path: self.host};
} else {
connection_options = {port: self.port, host: self.host};
if (self.family !== void 0) {
connection_options.family = self.family;
}
}
self.connection = net.createConnection(connection_options);
// Set the options for the connection
self.connection.setKeepAlive(self.keepAlive, self.keepAliveInitialDelay);
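The connection is now created from an options object so the new `family` option (NODE-1020) can be forwarded to Node's DNS lookup. A brief sketch of the effect; the variable names here are illustrative:

```
var net = require('net');

var connection_options = { port: 27017, host: 'localhost' };
var family = 6; // e.g. from new Connection(handler, { family: 6 })

if (family !== void 0) {
  connection_options.family = family; // restrict dns.lookup to IPv6 results
}
// Unix domain sockets instead use { path: host }, as in the code above.
var socket = net.createConnection(connection_options);
socket.on('error', function(err) { /* avoid an unhandled 'error' event in this sketch */ });
```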
@ -416,6 +437,7 @@ Connection.prototype.connect = function(_options) {
// Set options for ssl
if(self.ca) sslOptions.ca = self.ca;
if(self.crl) sslOptions.crl = self.crl;
if(self.cert) sslOptions.cert = self.cert;
if(self.key) sslOptions.key = self.key;
if(self.passphrase) sslOptions.passphrase = self.passphrase;
@ -450,7 +472,7 @@ Connection.prototype.connect = function(_options) {
});
self.connection.setTimeout(self.connectionTimeout);
} else {
self.connection.on('connect', function() {
self.connection.once('connect', function() {
// Set socket timeout instead of connection timeout
self.connection.setTimeout(self.socketTimeout);
// Emit connect event
@ -488,7 +510,9 @@ Connection.prototype.destroy = function() {
// Set the connections
if(connectionAccounting) deleteConnection(this.id);
if(this.connection) {
this.connection.end();
// Catch posssible exception thrown by node 0.10.x
try { this.connection.end(); } catch (err) {}
// Destroy connection
this.connection.destroy();
}
@ -512,10 +536,21 @@ Connection.prototype.write = function(buffer) {
}
}
// Write out the command
if(!Array.isArray(buffer)) return this.connection.write(buffer, 'binary');
// Iterate over all buffers and write them in order to the socket
for(i = 0; i < buffer.length; i++) this.connection.write(buffer[i], 'binary');
// Double check that the connection is not destroyed
if(this.connection.destroyed === false) {
// Write out the command
if(!Array.isArray(buffer)) {
this.connection.write(buffer, 'binary');
return true;
}
// Iterate over all buffers and write them in order to the socket
for(i = 0; i < buffer.length; i++) this.connection.write(buffer[i], 'binary');
return true;
}
// Connection is destroyed return write failed
return false;
}
/**

View file

@ -8,7 +8,7 @@ var inherits = require('util').inherits,
f = require('util').format,
Query = require('./commands').Query,
CommandResult = require('./command_result'),
assign = require('../topologies/shared').assign;
assign = require('../utils').assign;
var MongoCR = require('../auth/mongocr')
, X509 = require('../auth/x509')
@ -35,14 +35,15 @@ var _id = 0;
* @param {number} [options.reconnectTries=30] Server attempt to reconnect #times
* @param {number} [options.reconnectInterval=1000] Server will wait # milliseconds between retries
* @param {boolean} [options.keepAlive=true] TCP Connection keep alive enabled
* @param {number} [options.keepAliveInitialDelay=0] Initial delay before TCP keep alive enabled
* @param {number} [options.keepAliveInitialDelay=300000] Initial delay before TCP keep alive enabled
* @param {boolean} [options.noDelay=true] TCP Connection no delay
* @param {number} [options.connectionTimeout=0] TCP Connection timeout setting
* @param {number} [options.socketTimeout=0] TCP Socket timeout setting
* @param {number} [options.connectionTimeout=30000] TCP Connection timeout setting
* @param {number} [options.socketTimeout=360000] TCP Socket timeout setting
* @param {number} [options.monitoringSocketTimeout=30000] TCP Socket timeout setting for replicaset monitoring socket
* @param {boolean} [options.ssl=false] Use SSL for connection
* @param {boolean|function} [options.checkServerIdentity=true] Ensure we check server identify during SSL, set to false to disable checking. Only works for Node 0.12.x or higher. You can pass in a boolean or your own checkServerIdentity override function.
* @param {Buffer} [options.ca] SSL Certificate store binary buffer
* @param {Buffer} [options.crl] SSL Certificate revocation store binary buffer
* @param {Buffer} [options.cert] SSL Certificate binary buffer
* @param {Buffer} [options.key] SSL Key file binary buffer
* @param {string} [options.passPhrase] SSL Certificate pass phrase
@ -70,13 +71,13 @@ var Pool = function(options) {
size: 5,
// socket settings
connectionTimeout: 30000,
socketTimeout: 30000,
socketTimeout: 360000,
keepAlive: true,
keepAliveInitialDelay: 0,
keepAliveInitialDelay: 300000,
noDelay: true,
// SSL Settings
ssl: false, checkServerIdentity: true,
ca: null, cert: null, key: null, passPhrase: null,
ca: null, crl: null, cert: null, key: null, passPhrase: null,
rejectUnauthorized: false,
promoteLongs: true,
promoteValues: true,
@ -89,6 +90,9 @@ var Pool = function(options) {
domainsEnabled: false
}, options);
// console.log("=================================== pool options")
// console.dir(this.options)
// Identification information
this.id = _id++;
// Current reconnect retries
@ -121,6 +125,9 @@ var Pool = function(options) {
, 'sspi': new SSPI(options.bson), 'scram-sha-1': new ScramSHA1(options.bson)
}
// Contains the reconnect connection
this.reconnectConnection = null;
// Are we currently authenticating
this.authenticating = false;
this.loggingout = false;
@ -228,6 +235,9 @@ function reauthenticate(pool, connection, cb) {
function connectionFailureHandler(self, event) {
return function(err) {
// console.log("========== connectionFailureHandler :: " + event)
// console.dir(err)
if (this._connectionFailHandled) return;
this._connectionFailHandled = true;
// Destroy the connection
@ -279,6 +289,7 @@ function connectionFailureHandler(self, event) {
function attemptReconnect(self) {
return function() {
// console.log("========================= attemptReconnect")
self.emit('attemptReconnect', self);
if(self.state == DESTROYED || self.state == DESTROYING) return;
@ -289,8 +300,9 @@ function attemptReconnect(self) {
}
// If we have failure schedule a retry
function _connectionFailureHandler(self) {
function _connectionFailureHandler(self, event) {
return function() {
// console.log("========== _connectionFailureHandler :: " + event)
if (this._connectionFailHandled) return;
this._connectionFailHandled = true;
// Destroy the connection
@ -341,6 +353,8 @@ function attemptReconnect(self) {
self.retriesLeft = self.options.reconnectTries;
// Push to available connections
self.availableConnections.push(connection);
// Set the reconnectConnection to null
self.reconnectConnection = null;
// Emit reconnect event
self.emit('reconnect', self);
// Trigger execute to start everything up again
@ -350,16 +364,16 @@ function attemptReconnect(self) {
}
// Create a connection
var connection = new Connection(messageHandler(self), self.options);
self.reconnectConnection = new Connection(messageHandler(self), self.options);
// Add handlers
connection.on('close', _connectionFailureHandler(self, 'close'));
connection.on('error', _connectionFailureHandler(self, 'error'));
connection.on('timeout', _connectionFailureHandler(self, 'timeout'));
connection.on('parseError', _connectionFailureHandler(self, 'parseError'));
self.reconnectConnection.on('close', _connectionFailureHandler(self, 'close'));
self.reconnectConnection.on('error', _connectionFailureHandler(self, 'error'));
self.reconnectConnection.on('timeout', _connectionFailureHandler(self, 'timeout'));
self.reconnectConnection.on('parseError', _connectionFailureHandler(self, 'parseError'));
// On connection
connection.on('connect', _connectHandler(self));
self.reconnectConnection.on('connect', _connectHandler(self));
// Attempt connection
connection.connect();
self.reconnectConnection.connect();
}
}
@ -500,14 +514,14 @@ function messageHandler(self) {
*/
Pool.prototype.socketCount = function() {
return this.availableConnections.length
+ this.inUseConnections.length
+ this.connectingConnections.length;
+ this.inUseConnections.length;
// + this.connectingConnections.length;
}
/**
* Return all pool connections
* @method
* @return {Connectio[]} The pool connections
* @return {Connection[]} The pool connections
*/
Pool.prototype.allConnections = function() {
return this.availableConnections
@ -593,6 +607,20 @@ Pool.prototype.connect = function() {
connection.once('connect', function(connection) {
if(self.state == DESTROYED || self.state == DESTROYING) return self.destroy();
// If we are in a topology, delegate the auth to it
// This is to avoid issues where we would auth against an
// arbiter
if(self.options.inTopology) {
// Set connected mode
stateTransition(self, CONNECTED);
// Move the active connection
moveConnectionBetween(connection, self.connectingConnections, self.availableConnections);
// Emit the connect event
return self.emit('connect', self);
}
// Apply any store credentials
reauthenticate(self, connection, function(err) {
if(self.state == DESTROYED || self.state == DESTROYING) return self.destroy();
@ -638,7 +666,9 @@ Pool.prototype.connect = function() {
connection.connect();
} catch(err) {
// SSL or something threw on connect
self.emit('error', err);
process.nextTick(function() {
self.emit('error', err);
});
}
}
@ -667,14 +697,20 @@ Pool.prototype.auth = function(mechanism) {
// Authenticate all live connections
function authenticateLiveConnections(self, args, cb) {
// Get the current viable connections
var connections = self.availableConnections;
var connections = self.allConnections();
// Allow nothing else to use the connections while we authenticate them
self.availableConnections = [];
self.inUseConnections = [];
self.connectingConnections = [];
var connectionsCount = connections.length;
var error = null;
// No connections available, return
if(connectionsCount == 0) return callback(null);
if(connectionsCount == 0) {
self.authenticating = false;
return callback(null);
}
// Authenticate the connections
for(var i = 0; i < connections.length; i++) {
authenticate(self, args, connections[i], function(err) {
@ -824,11 +860,35 @@ Pool.prototype.destroy = function(force) {
.concat(self.inUseConnections)
.concat(self.nonAuthenticatedConnections)
.concat(self.connectingConnections);
// Flush any remaining work items with
// an error
while(self.queue.length > 0) {
var workItem = self.queue.shift();
if(typeof workItem.cb == 'function') {
workItem.cb(new MongoError('Pool was force destroyed'));
}
}
// Destroy the topology
return destroy(self, connections);
}
// Clear out the reconnect if set
if (this.reconnectId) {
clearTimeout(this.reconnectId);
}
// If we have a reconnect connection running, close
// immediately
if (this.reconnectConnection) {
this.reconnectConnection.destroy();
}
// Wait for the operations to drain before we close the pool
function checkStatus() {
flushMonitoringOperations(self.queue);
if(self.queue.length == 0) {
// Get all the known connections
var connections = self.availableConnections
@ -846,7 +906,12 @@ Pool.prototype.destroy = function(force) {
}
destroy(self, connections);
// } else if (self.queue.length > 0 && !this.reconnectId) {
} else {
// Ensure we empty the queue
_execute(self)();
// Set timeout
setTimeout(checkStatus, 1);
}
}
@ -986,6 +1051,9 @@ function removeConnection(self, connection) {
var handlers = ["close", "message", "error", "timeout", "parseError", "connect"];
function _createConnection(self) {
if(self.state == DESTROYED || self.state == DESTROYING) {
return;
}
var connection = new Connection(messageHandler(self), self.options);
// Push the connection
@ -1039,7 +1107,7 @@ function _createConnection(self) {
return _connection.destroy();
}
// If we are authenticating at the moment
// If we are c at the moment
// Do not automatially put in available connections
// As we need to apply the credentials first
if(self.authenticating) {
@ -1094,6 +1162,13 @@ function _execute(self) {
// Block on any auth in process
waitForAuth(function() {
// New pool connections are in progress, wait them to finish
// before executing any more operation to ensure distribution of
// operations
if(self.connectingConnections.length > 0) {
return;
}
// As long as we have available connections
while(true) {
// Total availble connections
@ -1114,12 +1189,83 @@ function _execute(self) {
}
// Get a connection
var connection = self.availableConnections[self.connectionIndex++ % self.availableConnections.length];
var connection = null;
// Locate all connections that have no work
var connections = [];
// Get a list of all connections
for(var i = 0; i < self.availableConnections.length; i++) {
if(self.availableConnections[i].workItems.length == 0) {
connections.push(self.availableConnections[i]);
}
}
// No connection found that has no work on it, just pick one for pipelining
if(connections.length == 0) {
connection = self.availableConnections[self.connectionIndex++ % self.availableConnections.length];
} else {
connection = connections[self.connectionIndex++ % connections.length];
}
// Is the connection connected
if(connection.isConnected()) {
// Get the next work item
var workItem = self.queue.shift();
// If we are monitoring we need to use a connection that is not
// running another operation to avoid socket timeout changes
// affecting an existing operation
if (workItem.monitoring) {
var foundValidConnection = false;
for (var i = 0; i < self.availableConnections.length; i++) {
// If the connection is connected
// And there are no pending workItems on it
// Then we can safely use it for monitoring.
if(self.availableConnections[i].isConnected()
&& self.availableConnections[i].workItems.length == 0) {
foundValidConnection = true;
connection = self.availableConnections[i];
break;
}
}
// No safe connection found, attempt to grow the connections
// if possible and break from the loop
if(!foundValidConnection) {
// Put workItem back on the queue
self.queue.unshift(workItem);
// Attempt to grow the pool if it's not yet maxsize
if(totalConnections < self.options.size
&& self.queue.length > 0) {
// Create a new connection
_createConnection(self);
}
// Re-execute the operation
setTimeout(function() {
_execute(self)();
}, 10);
break;
}
}
// Don't execute operation until we have a full pool
if(totalConnections < self.options.size) {
// Connection has work items, then put it back on the queue
// and create a new connection
if(connection.workItems.length > 0) {
// Lets put the workItem back on the list
self.queue.unshift(workItem);
// Create a new connection
_createConnection(self);
// Break from the loop
break;
}
}
// Get actual binary commands
var buffer = workItem.buffer;
@ -1143,24 +1289,28 @@ function _execute(self) {
connection.setSocketTimeout(workItem.socketTimeout);
}
// Capture if write was successful
var writeSuccessful = true;
// Put operation on the wire
if(Array.isArray(buffer)) {
for(var i = 0; i < buffer.length; i++) {
connection.write(buffer[i])
writeSuccessful = connection.write(buffer[i])
}
} else {
connection.write(buffer);
writeSuccessful = connection.write(buffer);
}
if(workItem.immediateRelease && self.authenticating) {
if(writeSuccessful && workItem.immediateRelease && self.authenticating) {
removeConnection(self, connection);
self.nonAuthenticatedConnections.push(connection);
}
// Have we not reached the max connection size yet
if(totalConnections < self.options.size
&& self.queue.length > 0) {
// Create a new connection
_createConnection(self);
} else if(writeSuccessful === false) {
// If write not successful put back on queue
self.queue.unshift(workItem);
// Remove the disconnected connection
removeConnection(self, connection);
// Flush any monitoring operations in the queue, failing fast
flushMonitoringOperations(self.queue);
}
} else {
// Remove the disconnected connection

View file

@ -1,6 +1,7 @@
"use strict";
var f = require('util').format;
var f = require('util').format,
require_optional = require('require_optional');
// Set property function
var setProperty = function(obj, prop, flag, values) {
@ -62,8 +63,24 @@ var debugOptions = function(debugFields, options) {
return finaloptions;
}
var retrieveBSON = function() {
var BSON = require('bson');
BSON.native = false;
try {
var optionalBSON = require_optional('bson-ext');
if(optionalBSON) {
optionalBSON.native = true;
return optionalBSON;
}
} catch(err) {}
return BSON;
}
exports.setProperty = setProperty;
exports.getProperty = getProperty;
exports.getSingleProperty = getSingleProperty;
exports.copy = copy;
exports.debugOptions = debugOptions;
exports.retrieveBSON = retrieveBSON;
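A quick sketch of what the new `retrieveBSON` helper selects at runtime, assuming `mongodb-core` is installed in the usual node_modules layout:

```
var retrieveBSON = require('mongodb-core/lib/connection/utils').retrieveBSON;

var BSON = retrieveBSON();
// BSON.native is true only when the optional bson-ext addon resolved;
// otherwise the pure-JavaScript js-bson parser is returned.
console.log(BSON.native ? 'using bson-ext (native)' : 'using js-bson');
```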

View file

@ -1,10 +1,13 @@
"use strict";
var Long = require('bson').Long
, Logger = require('./connection/logger')
var Logger = require('./connection/logger')
, retrieveBSON = require('./connection/utils').retrieveBSON
, MongoError = require('./error')
, f = require('util').format;
var BSON = retrieveBSON(),
Long = BSON.Long;
/**
* This is a cursor results callback
*
@ -279,7 +282,6 @@ Cursor.prototype._find = function(callback) {
if(typeof self.cursorState.promoteBuffers == 'boolean') {
queryOptions.promoteBuffers = self.cursorState.promoteBuffers;
}
// Write the initial command out
self.server.s.pool.write(self.query, queryOptions, queryCallback);
}
@ -532,12 +534,21 @@ var nextFunction = function(self, callback) {
if(!self.cursorState.init) {
// Topology is not connected, save the call in the provided store to be
// Executed at some point when the handler deems it's reconnected
if(!self.topology.isConnected(self.options) && self.disconnectHandler != null) {
if (self.topology.isDestroyed()) {
// Topology was destroyed, so don't try to wait for it to reconnect
return callback(new MongoError('Topology was destroyed'));
if(!self.topology.isConnected(self.options)) {
// Only need this for single server, because repl sets and mongos
// will always continue trying to reconnect
if (self.topology._type === 'server' && !self.topology.s.options.reconnect) {
// Reconnect is disabled, so we'll never reconnect
return callback(new MongoError('no connection available'));
}
if (self.disconnectHandler != null) {
if (self.topology.isDestroyed()) {
// Topology was destroyed, so don't try to wait for it to reconnect
return callback(new MongoError('Topology was destroyed'));
}
return self.disconnectHandler.addObjectAndMethod('cursor', self, 'next', [callback], callback);
}
return self.disconnectHandler.addObjectAndMethod('cursor', self, 'next', [callback], callback);
}
try {
@ -662,12 +673,12 @@ var nextFunction = function(self, callback) {
var doc = self.cursorState.documents[self.cursorState.cursorIndex++];
// Doc overflow
if(doc.$err) {
if(!doc || doc.$err) {
// Ensure we kill the cursor on the server
self.kill();
// Set cursor in dead and notified state
return setCursorDeadAndNotified(self, function() {
handleCallback(callback, new MongoError(doc.$err));
handleCallback(callback, new MongoError(doc ? doc.$err : undefined));
});
}
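A rough sketch of how the new fail-fast branch surfaces to callers; the constructor options and error text come from this diff, everything else is illustrative:

```
// With reconnect disabled on a single-server topology, a cursor used while
// the connection is down now errors immediately instead of buffering.
var Server = require('mongodb-core').Server;
var server = new Server({ host: 'localhost', port: 27017, reconnect: false });
// ...after the connection drops, any cursor.next() on this topology yields
// callback(new MongoError('no connection available')) rather than waiting
// for a reconnect that will never happen.
```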

View file

@ -3,15 +3,20 @@
var inherits = require('util').inherits,
f = require('util').format,
EventEmitter = require('events').EventEmitter,
BSON = require('bson').native().BSON,
BasicCursor = require('../cursor'),
Logger = require('../connection/logger'),
retrieveBSON = require('../connection/utils').retrieveBSON,
MongoError = require('../error'),
Server = require('./server'),
assign = require('./shared').assign,
assign = require('../utils').assign,
clone = require('./shared').clone,
sdam = require('./shared'),
diff = require('./shared').diff,
cloneOptions = require('./shared').cloneOptions,
createClientInfo = require('./shared').createClientInfo;
var BSON = retrieveBSON();
/**
* @fileOverview The **Mongos** class is a class that represents a Mongos Proxy topology and is
* used to construct connections.
@ -43,13 +48,15 @@ var MongoCR = require('../auth/mongocr')
var DISCONNECTED = 'disconnected';
var CONNECTING = 'connecting';
var CONNECTED = 'connected';
var UNREFERENCED = 'unreferenced';
var DESTROYED = 'destroyed';
function stateTransition(self, newState) {
var legalTransitions = {
'disconnected': [CONNECTING, DESTROYED, DISCONNECTED],
'connecting': [CONNECTING, DESTROYED, CONNECTED, DISCONNECTED],
'connected': [CONNECTED, DISCONNECTED, DESTROYED],
'connected': [CONNECTED, DISCONNECTED, DESTROYED, UNREFERENCED],
'unreferenced': [UNREFERENCED, DESTROYED],
'destroyed': [DESTROYED]
}
@ -85,6 +92,7 @@ var handlers = ['connect', 'close', 'error', 'timeout', 'parseError'];
* @param {boolean} [options.ssl=false] Use SSL for connection
* @param {boolean|function} [options.checkServerIdentity=true] Ensure we check server identity during SSL, set to false to disable checking. Only works for Node 0.12.x or higher. You can pass in a boolean or your own checkServerIdentity override function.
* @param {Buffer} [options.ca] SSL Certificate store binary buffer
* @param {Buffer} [options.crl] SSL Certificate revocation store binary buffer
* @param {Buffer} [options.cert] SSL Certificate binary buffer
* @param {Buffer} [options.key] SSL Key file binary buffer
* @param {string} [options.passphrase] SSL Certificate pass phrase
@ -108,6 +116,8 @@ var handlers = ['connect', 'close', 'error', 'timeout', 'parseError'];
* @fires Mongos#topologyOpening
* @fires Mongos#topologyClosed
* @fires Mongos#topologyDescriptionChanged
* @property {string} type the topology type.
* @property {string} parserType the parser type used (c++ or js).
*/
var Mongos = function(seedlist, options) {
options = options || {};
@ -119,7 +129,9 @@ var Mongos = function(seedlist, options) {
this.s = {
options: assign({}, options),
// BSON instance
bson: options.bson || new BSON(),
bson: options.bson || new BSON([BSON.Binary, BSON.Code, BSON.DBRef, BSON.Decimal128,
BSON.Double, BSON.Int32, BSON.Long, BSON.Map, BSON.MaxKey, BSON.MinKey,
BSON.ObjectId, BSON.BSONRegExp, BSON.Symbol, BSON.Timestamp]),
// Factory overrides
Cursor: options.cursorFactory || BasicCursor,
// Logger instance
@ -139,7 +151,9 @@ var Mongos = function(seedlist, options) {
// localThresholdMS
localThresholdMS: options.localThresholdMS || 15,
// Client info
clientInfo: createClientInfo(options)
clientInfo: createClientInfo(options),
// Authentication context
authenticationContexts: [],
}
// Set the client info
@ -179,6 +193,11 @@ var Mongos = function(seedlist, options) {
// Last ismaster
this.ismaster = null;
// Description of the sharded topology
this.topologyDescription = {
"topologyType": "Unknown", "servers": []
};
// Add event listener
EventEmitter.call(this);
}
@ -189,6 +208,12 @@ Object.defineProperty(Mongos.prototype, 'type', {
enumerable:true, get: function() { return 'mongos'; }
});
Object.defineProperty(Mongos.prototype, 'parserType', {
enumerable:true, get: function() {
return BSON.native ? "c++" : "js";
}
});
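A minimal construction sketch for the options documented above; the seedlist hosts and ports are placeholders and the event wiring is only illustrative:

```
var Mongos = require('mongodb-core').Mongos;

var topology = new Mongos(
  [{ host: 'localhost', port: 51000 }, { host: 'localhost', port: 51001 }],
  { size: 5, connectionTimeout: 30000 }
);

console.log(topology.type);       // 'mongos'
console.log(topology.parserType); // 'c++' when bson-ext is loaded, otherwise 'js'

topology.on('topologyDescriptionChanged', function(event) {
  console.log('servers changed:', event.diff.servers);
});

topology.connect();
```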
/**
* Emit event if it exists
* @method
@ -219,6 +244,10 @@ Mongos.prototype.connect = function(options) {
}));
});
servers.forEach(function(server) {
server.on('serverDescriptionChanged', function(event) { self.emit('serverDescriptionChanged', event); });
});
// Emit the topology opening event
emitSDAMEvent(this, 'topologyOpening', { topologyId: this.id });
@ -231,15 +260,26 @@ function handleEvent(self) {
if(self.state == DESTROYED) return;
// Move to list of disconnectedProxies
moveServerFrom(self.connectedProxies, self.disconnectedProxies, this);
// Emit the initial topology
emitTopologyDescriptionChanged(self);
// Emit the left signal
self.emit('left', 'mongos', this);
// Emit the sdam event
self.emit('serverClosed', {
topologyId: self.id,
address: this.name
});
}
}
function handleInitialConnectEvent(self, event) {
return function() {
var _this = this;
// Destroy the instance
if(self.state == DESTROYED) {
// Emit the initial topology
emitTopologyDescriptionChanged(self);
// Move from connectingProxies
moveServerFrom(self.connectingProxies, self.disconnectedProxies, this);
return this.destroy();
@ -247,56 +287,61 @@ function handleInitialConnectEvent(self, event) {
// Check the type of server
if(event == 'connect') {
// Get last known ismaster
self.ismaster = this.lastIsMaster();
// Do we have authentication contexts that need to be applied
applyAuthenticationContexts(self, _this, function() {
// Get last known ismaster
self.ismaster = _this.lastIsMaster();
// If this is not a proxy, remove it
if(self.ismaster.msg == 'isdbgrid') {
// Add to the connected list
for(var i = 0; i < self.connectedProxies.length; i++) {
if(self.connectedProxies[i].name == this.name) {
// Move from connectingProxies
moveServerFrom(self.connectingProxies, self.disconnectedProxies, this);
this.destroy();
return self.emit('failed', this);
}
}
// Remove the handlers
for(i = 0; i < handlers.length; i++) {
this.removeAllListeners(handlers[i]);
}
// Add stable state handlers
this.on('error', handleEvent(self, 'error'));
this.on('close', handleEvent(self, 'close'));
this.on('timeout', handleEvent(self, 'timeout'));
this.on('parseError', handleEvent(self, 'parseError'));
// Move from connecting proxies to connected
moveServerFrom(self.connectingProxies, self.connectedProxies, this);
// Emit the joined event
self.emit('joined', 'mongos', this);
} else {
// Print warning if we did not find a mongos proxy
if(self.s.logger.isWarn()) {
var message = 'expected mongos proxy, but found replicaset member mongod for server %s';
// We have a standalone server
if(!self.ismaster.hosts) {
message = 'expected mongos proxy, but found standalone mongod for server %s';
// If this is not a proxy, remove it
if(self.ismaster.msg == 'isdbgrid') {
// Add to the connected list
for(var i = 0; i < self.connectedProxies.length; i++) {
if(self.connectedProxies[i].name == _this.name) {
// Move from connectingProxies
moveServerFrom(self.connectingProxies, self.disconnectedProxies, _this);
// Emit the initial topology
emitTopologyDescriptionChanged(self);
_this.destroy();
return self.emit('failed', _this);
}
}
self.s.logger.warn(f(message, this.name));
}
// Remove the handlers
for(i = 0; i < handlers.length; i++) {
_this.removeAllListeners(handlers[i]);
}
// This is not a mongos proxy, remove it completely
removeProxyFrom(self.connectingProxies, this);
// Emit the left event
self.emit('left', 'server', this);
// Emit failed event
self.emit('failed', this);
}
// Add stable state handlers
_this.on('error', handleEvent(self, 'error'));
_this.on('close', handleEvent(self, 'close'));
_this.on('timeout', handleEvent(self, 'timeout'));
_this.on('parseError', handleEvent(self, 'parseError'));
// Move from connecting proxies to connected
moveServerFrom(self.connectingProxies, self.connectedProxies, _this);
// Emit the joined event
self.emit('joined', 'mongos', _this);
} else {
// Print warning if we did not find a mongos proxy
if(self.s.logger.isWarn()) {
var message = 'expected mongos proxy, but found replicaset member mongod for server %s';
// We have a standalone server
if(!self.ismaster.hosts) {
message = 'expected mongos proxy, but found standalone mongod for server %s';
}
self.s.logger.warn(f(message, _this.name));
}
// This is not a mongos proxy, remove it completely
removeProxyFrom(self.connectingProxies, _this);
// Emit the left event
self.emit('left', 'server', _this);
// Emit failed event
self.emit('failed', _this);
}
});
} else {
moveServerFrom(self.connectingProxies, self.disconnectedProxies, this);
// Emit the left event
@ -305,6 +350,9 @@ function handleInitialConnectEvent(self, event) {
self.emit('failed', this);
}
// Emit the initial topology
emitTopologyDescriptionChanged(self);
// Trigger topologyMonitor
if(self.connectingProxies.length == 0) {
// Emit connected if we are connected
@ -341,16 +389,21 @@ function connectProxies(self, servers) {
function connect(server, timeoutInterval) {
setTimeout(function() {
// Emit opening server event
self.emit('serverOpening', {
topologyId: self.id,
address: server.name
});
// Emit the initial topology
emitTopologyDescriptionChanged(self);
// Add event handlers
server.once('close', handleInitialConnectEvent(self, 'close'));
server.once('timeout', handleInitialConnectEvent(self, 'timeout'));
server.once('parseError', handleInitialConnectEvent(self, 'parseError'));
server.once('error', handleInitialConnectEvent(self, 'error'));
server.once('connect', handleInitialConnectEvent(self, 'connect'));
// SDAM Monitoring events
server.on('serverOpening', function(e) { self.emit('serverOpening', e); });
server.on('serverDescriptionChanged', function(e) { self.emit('serverDescriptionChanged', e); });
server.on('serverClosed', function(e) { self.emit('serverClosed', e); });
// Start connection
server.connect(self.s.connectOptions);
}, timeoutInterval);
@ -431,33 +484,39 @@ function reconnectProxies(self, proxies, callback) {
count = count - 1;
// Destroyed
if(self.state == DESTROYED) {
if(self.state == DESTROYED || self.state == UNREFERENCED) {
moveServerFrom(self.connectingProxies, self.disconnectedProxies, _self);
// Return destroy
return this.destroy();
}
if(event == 'connect' && !self.authenticating) {
// Destroyed
if(self.state == DESTROYED) {
moveServerFrom(self.connectingProxies, self.disconnectedProxies, _self);
return _self.destroy();
}
// Do we have authentication contexts that need to be applied
applyAuthenticationContexts(self, _self, function() {
// Destroyed
if(self.state == DESTROYED || self.state == UNREFERENCED) {
moveServerFrom(self.connectingProxies, self.disconnectedProxies, _self);
return _self.destroy();
}
// Remove the handlers
for(var i = 0; i < handlers.length; i++) {
_self.removeAllListeners(handlers[i]);
}
// Remove the handlers
for(var i = 0; i < handlers.length; i++) {
_self.removeAllListeners(handlers[i]);
}
// Add stable state handlers
_self.on('error', handleEvent(self, 'error'));
_self.on('close', handleEvent(self, 'close'));
_self.on('timeout', handleEvent(self, 'timeout'));
_self.on('parseError', handleEvent(self, 'parseError'));
// Add stable state handlers
_self.on('error', handleEvent(self, 'error'));
_self.on('close', handleEvent(self, 'close'));
_self.on('timeout', handleEvent(self, 'timeout'));
_self.on('parseError', handleEvent(self, 'parseError'));
// Move to the connected servers
moveServerFrom(self.disconnectedProxies, self.connectedProxies, _self);
// Emit joined event
self.emit('joined', 'mongos', _self);
// Move to the connected servers
moveServerFrom(self.disconnectedProxies, self.connectedProxies, _self);
// Emit topology Change
emitTopologyDescriptionChanged(self);
// Emit joined event
self.emit('joined', 'mongos', _self);
});
} else if(event == 'connect' && self.authenticating) {
// Move from connectingProxies
moveServerFrom(self.connectingProxies, self.disconnectedProxies, _self);
@ -480,7 +539,7 @@ function reconnectProxies(self, proxies, callback) {
function execute(_server, i) {
setTimeout(function() {
// Destroyed
if(self.state == DESTROYED) {
if(self.state == DESTROYED || self.state == UNREFERENCED) {
return;
}
@ -494,6 +553,15 @@ function reconnectProxies(self, proxies, callback) {
clientInfo: clone(self.s.clientInfo)
}));
// Relay the server description change
server.on('serverDescriptionChanged', function(event) { self.emit('serverDescriptionChanged', event); });
// Emit opening server event
self.emit('serverOpening', {
topologyId: server.s.topologyId != -1 ? server.s.topologyId : self.id,
address: server.name
});
// Add temp handlers
server.once('connect', _handleEvent(self, 'connect'));
server.once('close', _handleEvent(self, 'close'));
@ -501,10 +569,7 @@ function reconnectProxies(self, proxies, callback) {
server.once('error', _handleEvent(self, 'error'));
server.once('parseError', _handleEvent(self, 'parseError'));
// SDAM Monitoring events
server.on('serverOpening', function(e) { self.emit('serverOpening', e); });
server.on('serverDescriptionChanged', function(e) { self.emit('serverDescriptionChanged', e); });
server.on('serverClosed', function(e) { self.emit('serverClosed', e); });
// Connect to proxy
server.connect(self.s.connectOptions);
}, i);
}
@ -515,12 +580,41 @@ function reconnectProxies(self, proxies, callback) {
}
}
function applyAuthenticationContexts(self, server, callback) {
if(self.s.authenticationContexts.length == 0) {
return callback();
}
// Copy contexts to ensure no modification in the middle of
// auth process.
var authContexts = self.s.authenticationContexts.slice(0);
// Apply one of the contexts
function applyAuth(authContexts, server, callback) {
if(authContexts.length == 0) return callback();
// Get the first auth context
var authContext = authContexts.shift();
// Copy the params
var customAuthContext = authContext.slice(0);
// Push our callback handler
customAuthContext.push(function(err) {
applyAuth(authContexts, server, callback);
});
// Attempt authentication
server.auth.apply(server, customAuthContext)
}
// Apply all auth contexts
applyAuth(authContexts, server, callback);
}
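To make the stored context shape concrete, here is an illustrative sketch of what Mongos.prototype.auth (further down in this diff) pushes and how applyAuthenticationContexts replays it; the mechanism, db and credential values are placeholders:

```
// What auth() stores for each call: [mechanism, db, ...credentials]
var storedContext = ['scram-sha-1', 'admin'].concat(['user', 'pass']);

// What applyAuth() does with it on a (re)connected proxy: copy the params,
// append its own continuation callback, then invoke server.auth(...)
var customAuthContext = storedContext.slice(0);
customAuthContext.push(function(err) { /* move on to the next context */ });
// server.auth.apply(server, customAuthContext);
```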
function topologyMonitor(self, options) {
options = options || {};
// Set monitoring timeout
self.haTimeoutId = setTimeout(function() {
if(self.state == DESTROYED) return;
if(self.state == DESTROYED || self.state == UNREFERENCED) return;
// If we have a primary and a disconnect handler, execute
// buffered operations
if(self.isConnected() && self.s.disconnectHandler) {
@ -547,7 +641,7 @@ function topologyMonitor(self, options) {
monitoring: true,
socketTimeout: self.s.options.connectionTimeout || 2000,
}, function(err, r) {
if(self.state == DESTROYED) {
if(self.state == DESTROYED || self.state == UNREFERENCED) {
// Move from connectingProxies
moveServerFrom(self.connectedProxies, self.disconnectedProxies, _server);
_server.destroy();
@ -587,7 +681,7 @@ function topologyMonitor(self, options) {
// Attempt to connect to any unknown servers
return reconnectProxies(self, self.disconnectedProxies, function() {
if(self.state == DESTROYED) return;
if(self.state == DESTROYED || self.state == UNREFERENCED) return;
// Are we connected ? emit connect event
if(self.state == CONNECTING && options.firstConnect) {
@ -611,11 +705,11 @@ function topologyMonitor(self, options) {
count = count - 1;
if(count == 0) {
if(self.state == DESTROYED) return;
if(self.state == DESTROYED || self.state == UNREFERENCED) return;
// Attempt to connect to any unknown servers
reconnectProxies(self, self.disconnectedProxies, function() {
if(self.state == DESTROYED) return;
if(self.state == DESTROYED || self.state == UNREFERENCED) return;
// Perform topology monitor
topologyMonitor(self);
});
@ -640,7 +734,7 @@ Mongos.prototype.lastIsMaster = function() {
*/
Mongos.prototype.unref = function() {
// Transition state
stateTransition(this, DISCONNECTED);
stateTransition(this, UNREFERENCED);
// Get all proxies
var proxies = this.connectedProxies.concat(this.connectingProxies);
proxies.forEach(function(x) {
@ -656,18 +750,32 @@ Mongos.prototype.unref = function() {
* @method
*/
Mongos.prototype.destroy = function(options) {
var self = this;
// Transition state
stateTransition(this, DESTROYED);
// Get all proxies
var proxies = this.connectedProxies.concat(this.connectingProxies);
// Clear out any monitoring process
if(this.haTimeoutId) clearTimeout(this.haTimeoutId);
// Clear out authentication contexts
this.s.authenticationContexts = [];
// Destroy all connecting servers
proxies.forEach(function(x) {
// Emit the sdam event
self.emit('serverClosed', {
topologyId: self.id,
address: x.name
});
// Destroy the server
x.destroy(options);
// Move to list of disconnectedProxies
moveServerFrom(self.connectedProxies, self.disconnectedProxies, x);
});
// Emit the final topology change
emitTopologyDescriptionChanged(self);
// Emit topology closing event
emitSDAMEvent(this, 'topologyClosed', { topologyId: this.id });
}
@ -824,8 +932,12 @@ Mongos.prototype.command = function(ns, cmd, options, callback) {
return callback(new MongoError('no mongos proxy available'));
}
// Cloned options
var clonedOptions = cloneOptions(options);
clonedOptions.topology = self;
// Execute the command
server.command(ns, cmd, options, callback);
server.command(ns, cmd, clonedOptions, callback);
}
/**
@ -859,6 +971,7 @@ Mongos.prototype.auth = function(mechanism, db) {
var self = this;
var args = Array.prototype.slice.call(arguments, 2);
var callback = args.pop();
var currentContextIndex = 0;
// If we don't have the mechanism fail
if(this.authProviders[mechanism] == null && mechanism != 'default') {
@ -904,9 +1017,14 @@ Mongos.prototype.auth = function(mechanism, db) {
self.authenticating = false;
// Return the auth error
if(errors.length) return callback(MongoError.create({
message: 'authentication fail', errors: errors
}), false);
if(errors.length) {
// Remove the entry from the stored authentication contexts
self.s.authenticationContexts.splice(currentContextIndex, 1);
// Return error
return callback(MongoError.create({
message: 'authentication fail', errors: errors
}), false);
}
// Successfully authenticated session
callback(null, self);
@ -919,6 +1037,11 @@ Mongos.prototype.auth = function(mechanism, db) {
}
}
// Save current context index
currentContextIndex = this.s.authenticationContexts.length;
// Store the auth context and return the last index
this.s.authenticationContexts.push([mechanism, db].concat(args.slice(0)));
// Get total count
var count = servers.length;
// Authenticate against all servers
@ -985,7 +1108,6 @@ Mongos.prototype.logout = function(dbName, callback) {
/**
* Get server
* @method
* @param {ReadPreference} [options.readPreference] Specify read preference if command supports it
* @return {Server}
*/
Mongos.prototype.getServer = function() {
@ -994,6 +1116,16 @@ Mongos.prototype.getServer = function() {
return server;
}
/**
* Get a direct connection
* @method
* @return {Connection}
*/
Mongos.prototype.getConnection = function() {
var server = this.getServer();
if(server) return server.getConnection();
}
/**
* All raw connections
* @method
@ -1009,6 +1141,60 @@ Mongos.prototype.connections = function() {
return connections;
}
function emitTopologyDescriptionChanged(self) {
if(self.listeners('topologyDescriptionChanged').length > 0) {
var topology = 'Unknown';
var setName = self.setName;
if(self.connectedProxies.length > 0) {
topology = 'Sharded';
}
// Generate description
var description = {
topologyType: topology,
servers: []
}
// All proxies
var proxies = self.disconnectedProxies
.concat(self.connectingProxies);
// Add all the disconnected proxies
description.servers = description.servers.concat(proxies.map(function(x) {
var description = x.getDescription();
description.type = 'Unknown';
return description;
}));
// Add all the connected proxies
description.servers = description.servers.concat(self.connectedProxies.map(function(x) {
var description = x.getDescription();
description.type = 'Mongos';
return description;
}));
// Get the diff
var diffResult = diff(self.topologyDescription, description);
// Create the result
var result = {
topologyId: self.id,
previousDescription: self.topologyDescription,
newDescription: description,
diff: diffResult
};
// Emit the topologyDescription change
if(diffResult.servers.length > 0) {
self.emit('topologyDescriptionChanged', result);
}
// Set the new description
self.topologyDescription = description;
}
}
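For reference, a sketch of the event payload a listener receives from the function above, continuing the hypothetical topology instance from the earlier construction sketch; addresses and types are illustrative:

```
topology.on('topologyDescriptionChanged', function(event) {
  // event.previousDescription -> { topologyType: 'Unknown', servers: [] }
  // event.newDescription      -> { topologyType: 'Sharded',
  //                                servers: [ { address: 'localhost:51000', type: 'Mongos' }, ... ] }
  // event.diff.servers        -> [ { address: 'localhost:51000', from: 'Unknown', to: 'Mongos' } ]
});
```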
/**
* A mongos connect event, used to verify that the connection is up and running
*

File diff suppressed because it is too large

View file

@ -2,6 +2,7 @@
var inherits = require('util').inherits,
f = require('util').format,
diff = require('./shared').diff,
EventEmitter = require('events').EventEmitter,
Logger = require('../connection/logger'),
ReadPreference = require('./read_preference'),
@ -70,6 +71,10 @@ ReplSetState.prototype.hasPrimaryAndSecondary = function() {
return this.primary != null && this.secondaries.length > 0;
}
ReplSetState.prototype.hasPrimaryOrSecondary = function() {
return this.hasPrimary() || this.hasSecondary();
}
ReplSetState.prototype.hasPrimary = function() {
return this.primary != null;
}
@ -78,6 +83,18 @@ ReplSetState.prototype.hasSecondary = function() {
return this.secondaries.length > 0;
}
ReplSetState.prototype.get = function(host) {
var servers = this.allServers();
for(var i = 0; i < servers.length; i++) {
if(servers[i].name.toLowerCase() === host.toLowerCase()) {
return servers[i];
}
}
return null;
}
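A self-contained illustration of the case-insensitive match that get() applies while scanning allServers(); the host string is a placeholder:

```
function matchesHost(serverName, host) {
  return serverName.toLowerCase() === host.toLowerCase();
}

console.log(matchesHost('LOCALHOST:31000', 'localhost:31000')); // true
// replSetState.get('LOCALHOST:31000') therefore returns the same server
// instance as replSetState.get('localhost:31000').
```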
ReplSetState.prototype.allServers = function(options) {
options = options || {};
var servers = this.primary ? [this.primary] : [];
@ -101,11 +118,17 @@ ReplSetState.prototype.destroy = function(options) {
this.ghosts = [];
this.unknownServers = [];
this.set = {};
this.primary = null;
// Emit the topology changed
emitTopologyDescriptionChanged(this);
}
ReplSetState.prototype.remove = function(server, options) {
options = options || {};
// Get the server name and lowerCase it
var serverName = server.name.toLowerCase();
// Only remove if the current server is not connected
var servers = this.primary ? [this.primary] : [];
servers = servers.concat(this.secondaries);
@ -114,17 +137,20 @@ ReplSetState.prototype.remove = function(server, options) {
// Check if it's active and this is just a failed connection attempt
for(var i = 0; i < servers.length; i++) {
if(!options.force && servers[i].equals(server) && servers[i].isConnected && servers[i].isConnected()) {
if(!options.force
&& servers[i].equals(server)
&& servers[i].isConnected
&& servers[i].isConnected()) {
return;
}
}
// If we have it in the set remove it
if(this.set[server.name.toLowerCase()]) {
this.set[server.name.toLowerCase()].type = ServerType.Unknown;
this.set[server.name.toLowerCase()].electionId = null;
this.set[server.name.toLowerCase()].setName = null;
this.set[server.name.toLowerCase()].setVersion = null;
if(this.set[serverName]) {
this.set[serverName].type = ServerType.Unknown;
this.set[serverName].electionId = null;
this.set[serverName].setName = null;
this.set[serverName].setVersion = null;
}
// Remove type
@ -144,6 +170,9 @@ ReplSetState.prototype.remove = function(server, options) {
removeFrom(server, this.ghosts);
removeFrom(server, this.unknownServers);
// Push to unknownServers
this.unknownServers.push(serverName);
// Do we have a removeType
if(removeType) {
this.emit('left', removeType, server);
@ -155,6 +184,9 @@ ReplSetState.prototype.update = function(server) {
// Get the current ismaster
var ismaster = server.lastIsMaster();
// Get the server name and lowerCase it
var serverName = server.name.toLowerCase();
//
// Add any hosts
//
@ -163,17 +195,18 @@ ReplSetState.prototype.update = function(server) {
var hosts = Array.isArray(ismaster.hosts) ? ismaster.hosts : [];
hosts = hosts.concat(Array.isArray(ismaster.arbiters) ? ismaster.arbiters : []);
hosts = hosts.concat(Array.isArray(ismaster.passives) ? ismaster.passives : []);
hosts = hosts.map(function(s) { return s.toLowerCase() });
// Add all hosts as unknownServers
for(var i = 0; i < hosts.length; i++) {
// Add to the list of unknown server
if(this.unknownServers.indexOf(hosts[i]) == -1
&& (!this.set[hosts[i].toLowerCase()] || this.set[hosts[i].toLowerCase()].type == ServerType.Unknown)) {
this.unknownServers.push(hosts[i]);
&& (!this.set[hosts[i]] || this.set[hosts[i]].type == ServerType.Unknown)) {
this.unknownServers.push(hosts[i].toLowerCase());
}
if(!this.set[hosts[i].toLowerCase()]) {
this.set[hosts[i].toLowerCase()] = {
if(!this.set[hosts[i]]) {
this.set[hosts[i]] = {
type: ServerType.Unknown,
electionId: null,
setName: null,
@ -187,17 +220,17 @@ ReplSetState.prototype.update = function(server) {
// Unknown server
//
if(!ismaster && !inList(ismaster, server, this.unknownServers)) {
self.set[server.name.toLowerCase()] = {
self.set[serverName] = {
type: ServerType.Unknown, setVersion: null, electionId: null, setName: null
}
// Update set information about the server instance
self.set[server.name.toLowerCase()].type = ServerType.Unknown;
self.set[server.name.toLowerCase()].electionId = ismaster ? ismaster.electionId : ismaster;
self.set[server.name.toLowerCase()].setName = ismaster ? ismaster.setName : ismaster;
self.set[server.name.toLowerCase()].setVersion = ismaster ? ismaster.setVersion : ismaster;
self.set[serverName].type = ServerType.Unknown;
self.set[serverName].electionId = ismaster ? ismaster.electionId : ismaster;
self.set[serverName].setName = ismaster ? ismaster.setName : ismaster;
self.set[serverName].setVersion = ismaster ? ismaster.setVersion : ismaster;
if(self.unknownServers.indexOf(server.name) == -1) {
self.unknownServers.push(server.name);
self.unknownServers.push(serverName);
}
// Set the topology
@ -214,7 +247,7 @@ ReplSetState.prototype.update = function(server) {
// A RSOther instance
if((ismaster.setName && ismaster.hidden)
|| (ismaster.setName && !ismaster.ismaster && !ismaster.secondary && !ismaster.arbiterOnly && !ismaster.passive)) {
self.set[server.name.toLowerCase()] = {
self.set[serverName] = {
type: ServerType.RSOther, setVersion: null,
electionId: null, setName: ismaster.setName
}
@ -226,7 +259,7 @@ ReplSetState.prototype.update = function(server) {
// A RSGhost instance
if(ismaster.isreplicaset) {
self.set[server.name.toLowerCase()] = {
self.set[serverName] = {
type: ServerType.RSGhost, setVersion: null,
electionId: null, setName: null
}
@ -259,13 +292,18 @@ ReplSetState.prototype.update = function(server) {
//
// If the .me field does not match the passed in server
//
if(ismaster.me && ismaster.me != server.name) {
if(ismaster.me && ismaster.me.toLowerCase() != serverName) {
if(this.logger.isWarn()) {
this.logger.warn(f('the seedlist server was removed due to its address %s not matching its ismaster.me address %s', server.name, ismaster.me));
}
// Delete from the set
delete this.set[server.name.toLowerCase()];
delete this.set[serverName];
// Delete unknown servers
removeFrom(server, self.unknownServers);
// Destroy the instance
server.destroy();
// Set the type of topology we have
if(this.primary && !this.primary.equals(server)) {
@ -306,7 +344,6 @@ ReplSetState.prototype.update = function(server) {
// Get the electionIds
var ismasterSetVersion = server.lastIsMaster().setVersion;
// if(result == 1 || result == 0) {
if(result == 1) {
this.topologyType = TopologyType.ReplicaSetNoPrimary;
return false;
@ -323,12 +360,12 @@ ReplSetState.prototype.update = function(server) {
// Handle normalization of server names
var normalizedHosts = ismaster.hosts.map(function(x) { return x.toLowerCase() });
var locationIndex = normalizedHosts.indexOf(server.name.toLowerCase());
var locationIndex = normalizedHosts.indexOf(serverName);
// Validate that the server exists in the host list
if(locationIndex != -1) {
self.primary = server;
self.set[server.name.toLowerCase()] = {
self.set[serverName] = {
type: ServerType.RSPrimary,
setVersion: ismaster.setVersion,
electionId: ismaster.electionId,
@ -426,7 +463,7 @@ ReplSetState.prototype.update = function(server) {
// Set the new instance
self.primary = server;
// Set the set information
self.set[server.name.toLowerCase()] = {
self.set[serverName] = {
type: ServerType.RSPrimary, setVersion: ismaster.setVersion,
electionId: ismaster.electionId, setName: ismaster.setName
}
@ -463,12 +500,14 @@ ReplSetState.prototype.update = function(server) {
removeFrom(server, self.unknownServers);
// Remove primary
if(this.primary && this.primary.name == server.name) {
server.destroy();
this.primary = null;
self.emit('left', 'primary', server);
if(this.primary
&& this.primary.name.toLowerCase() == serverName) {
server.destroy();
this.primary = null;
self.emit('left', 'primary', server);
}
// Emit secondary joined replicaset
self.emit('joined', 'secondary', server);
emitTopologyDescriptionChanged(self);
return true;
@ -503,10 +542,11 @@ ReplSetState.prototype.update = function(server) {
removeFrom(server, self.unknownServers);
// Remove primary
if(this.primary && this.primary.name == server.name) {
server.destroy();
this.primary = null;
self.emit('left', 'primary', server);
if(this.primary
&& this.primary.name.toLowerCase() == serverName) {
server.destroy();
this.primary = null;
self.emit('left', 'primary', server);
}
self.emit('joined', 'secondary', server);
@ -517,7 +557,7 @@ ReplSetState.prototype.update = function(server) {
//
// Remove the primary
//
if(this.set[server.name.toLowerCase()] && this.set[server.name.toLowerCase()].type == ServerType.RSPrimary) {
if(this.set[serverName] && this.set[serverName].type == ServerType.RSPrimary) {
self.emit('left', 'primary', this.primary);
this.primary.destroy();
this.primary = null;
@ -555,7 +595,7 @@ ReplSetState.prototype.updateServerMaxStaleness = function(server, haInterval) {
}
/**
* Recalculate all the stalness values for secodaries
* Recalculate all the staleness values for secondaries
* @method
*/
ReplSetState.prototype.updateSecondariesMaxStaleness = function(haInterval) {
@ -719,8 +759,10 @@ function pickNearestMaxStalenessSeconds(self, readPreference) {
}
// Add primary to list if not a secondary read preference
if(self.primary && readPreference.preference != 'secondary') {
servers.push(self.primary);
if(self.primary
&& readPreference.preference != 'secondary'
&& readPreference.preference != 'secondaryPreferred') {
servers.push(self.primary);
}
// Add all the secondaries
@ -728,6 +770,13 @@ function pickNearestMaxStalenessSeconds(self, readPreference) {
servers.push(self.secondaries[i]);
}
// If we have a secondaryPreferred readPreference and no server add the primary
if(self.primary
&& servers.length == 0
&& readPreference.preference != 'secondaryPreferred') {
servers.push(self.primary);
}
// Filter by tags
servers = filterByTags(readPreference, servers);
@ -742,8 +791,7 @@ function pickNearestMaxStalenessSeconds(self, readPreference) {
// Sort by time
servers.sort(function(a, b) {
// return a.time > b.time;
return a.lastIsMasterMS > b.lastIsMasterMS
return a.lastIsMasterMS - b.lastIsMasterMS
});
// No servers, default to primary
@ -767,8 +815,10 @@ function pickNearest(self, readPreference) {
var servers = [];
// Add primary to list if not a secondary read preference
if(self.primary && readPreference.preference != 'secondary') {
servers.push(self.primary);
if(self.primary
&& readPreference.preference != 'secondary'
&& readPreference.preference != 'secondaryPreferred') {
servers.push(self.primary);
}
// Add all the secondaries
@ -776,13 +826,19 @@ function pickNearest(self, readPreference) {
servers.push(self.secondaries[i]);
}
// If we have a secondaryPreferred readPreference and no server add the primary
if(servers.length == 0
&& self.primary
&& readPreference.preference != 'secondaryPreferred') {
servers.push(self.primary);
}
// Filter by tags
servers = filterByTags(readPreference, servers);
// Sort by time
servers.sort(function(a, b) {
// return a.time > b.time;
return a.lastIsMasterMS > b.lastIsMasterMS
return a.lastIsMasterMS - b.lastIsMasterMS
});
// Locate lowest time (picked servers are lowest time + acceptable Latency margin)
@ -810,18 +866,20 @@ function pickNearest(self, readPreference) {
function inList(ismaster, server, list) {
for(var i = 0; i < list.length; i++) {
if(list[i].name == server.name) return true;
if(list[i] && list[i].name
&& list[i].name.toLowerCase() == server.name.toLowerCase()) return true;
}
return false;
}
function addToList(self, type, ismaster, server, list) {
var serverName = server.name.toLowerCase();
// Update set information about the server instance
self.set[server.name.toLowerCase()].type = type;
self.set[server.name.toLowerCase()].electionId = ismaster ? ismaster.electionId : ismaster;
self.set[server.name.toLowerCase()].setName = ismaster ? ismaster.setName : ismaster;
self.set[server.name.toLowerCase()].setVersion = ismaster ? ismaster.setVersion : ismaster;
self.set[serverName].type = type;
self.set[serverName].electionId = ismaster ? ismaster.electionId : ismaster;
self.set[serverName].setName = ismaster ? ismaster.setName : ismaster;
self.set[serverName].setVersion = ismaster ? ismaster.setVersion : ismaster;
// Add to the list
list.push(server);
}
@ -861,9 +919,10 @@ function removeFrom(server, list) {
if(list[i].equals && list[i].equals(server)) {
list.splice(i, 1);
return true;
} else if(typeof list[i] == 'string' && list[i] == server.name) {
list.splice(i, 1);
return true;
} else if(typeof list[i] == 'string'
&& list[i].toLowerCase() == server.name.toLowerCase()) {
list.splice(i, 1);
return true;
}
}
@ -916,57 +975,25 @@ function emitTopologyDescriptionChanged(self) {
return description;
}));
// Get the diff
var diffResult = diff(self.replicasetDescription, description);
// Create the result
var result = {
topologyId: self.id,
previousDescription: self.replicasetDescription,
newDescription: description,
diff: diff(self.replicasetDescription, description)
diff: diffResult,
};
// Emit the topologyDescription change
self.emit('topologyDescriptionChanged', result);
// if(diffResult.servers.length > 0) {
self.emit('topologyDescriptionChanged', result);
// }
// Set the new description
self.replicasetDescription = description;
}
}
function diff(previous, current) {
// Difference document
var diff = {
servers: []
}
// Previous entry
if(!previous) {
previous = { servers: [] };
}
// Got through all the servers
for(var i = 0; i < previous.servers.length; i++) {
var prevServer = previous.servers[i];
// Go through all current servers
for(var j = 0; j < current.servers.length; j++) {
var currServer = current.servers[j];
// Matching server
if(prevServer.address === currServer.address) {
// We had a change in state
if(prevServer.type != currServer.type) {
diff.servers.push({
address: prevServer.address,
from: prevServer.type,
to: currServer.type
});
}
}
}
}
// Return difference
return diff;
}
module.exports = ReplSetState;

View file

@ -3,10 +3,10 @@
var inherits = require('util').inherits,
f = require('util').format,
EventEmitter = require('events').EventEmitter,
BSON = require('bson').native().BSON,
ReadPreference = require('./read_preference'),
Logger = require('../connection/logger'),
debugOptions = require('../connection/utils').debugOptions,
retrieveBSON = require('../connection/utils').retrieveBSON,
Pool = require('../connection/pool'),
Query = require('../connection/commands').Query,
MongoError = require('../error'),
@ -15,19 +15,20 @@ var inherits = require('util').inherits,
ThreeTwoWireProtocolSupport = require('../wireprotocol/3_2_support'),
BasicCursor = require('../cursor'),
sdam = require('./shared'),
assign = require('./shared').assign,
assign = require('../utils').assign,
createClientInfo = require('./shared').createClientInfo;
// Used for filtering out fields for logging
var debugFields = ['reconnect', 'reconnectTries', 'reconnectInterval', 'emitError', 'cursorFactory', 'host'
, 'port', 'size', 'keepAlive', 'keepAliveInitialDelay', 'noDelay', 'connectionTimeout', 'checkServerIdentity'
, 'socketTimeout', 'singleBufferSerializtion', 'ssl', 'ca', 'cert', 'key', 'rejectUnauthorized', 'promoteLongs', 'promoteValues'
, 'socketTimeout', 'singleBufferSerializtion', 'ssl', 'ca', 'crl', 'cert', 'key', 'rejectUnauthorized', 'promoteLongs', 'promoteValues'
, 'promoteBuffers', 'servername'];
// Server instance id
var id = 0;
var serverAccounting = false;
var servers = {};
var BSON = retrieveBSON();
/**
* Creates a new Server instance
@ -42,13 +43,14 @@ var servers = {};
* @param {number} options.port The server port
* @param {number} [options.size=5] Server connection pool size
* @param {boolean} [options.keepAlive=true] TCP Connection keep alive enabled
* @param {number} [options.keepAliveInitialDelay=0] Initial delay before TCP keep alive enabled
* @param {number} [options.keepAliveInitialDelay=300000] Initial delay before TCP keep alive enabled
* @param {boolean} [options.noDelay=true] TCP Connection no delay
* @param {number} [options.connectionTimeout=0] TCP Connection timeout setting
* @param {number} [options.socketTimeout=0] TCP Socket timeout setting
* @param {number} [options.connectionTimeout=30000] TCP Connection timeout setting
* @param {number} [options.socketTimeout=360000] TCP Socket timeout setting
* @param {boolean} [options.ssl=false] Use SSL for connection
* @param {boolean|function} [options.checkServerIdentity=true] Ensure we check server identity during SSL, set to false to disable checking. Only works for Node 0.12.x or higher. You can pass in a boolean or your own checkServerIdentity override function.
* @param {Buffer} [options.ca] SSL Certificate store binary buffer
* @param {Buffer} [options.crl] SSL Certificate revocation store binary buffer
* @param {Buffer} [options.cert] SSL Certificate binary buffer
* @param {Buffer} [options.key] SSL Key file binary buffer
* @param {string} [options.passphrase] SSL Certificate pass phrase
@ -73,6 +75,8 @@ var servers = {};
* @fires Server#topologyOpening
* @fires Server#topologyClosed
* @fires Server#topologyDescriptionChanged
* @property {string} type the topology type.
* @property {string} parserType the parser type used (c++ or js).
*/
var Server = function(options) {
options = options || {};
@ -92,7 +96,9 @@ var Server = function(options) {
// Factory overrides
Cursor: options.cursorFactory || BasicCursor,
// BSON instance
bson: options.bson || new BSON(),
bson: options.bson || new BSON([BSON.Binary, BSON.Code, BSON.DBRef, BSON.Decimal128,
BSON.Double, BSON.Int32, BSON.Long, BSON.Map, BSON.MaxKey, BSON.MinKey,
BSON.ObjectId, BSON.BSONRegExp, BSON.Symbol, BSON.Timestamp]),
// Pool
pool: null,
// Disconnect handler
@ -140,6 +146,12 @@ Object.defineProperty(Server.prototype, 'type', {
enumerable:true, get: function() { return this._type; }
});
Object.defineProperty(Server.prototype, 'parserType', {
enumerable:true, get: function() {
return BSON.native ? "c++" : "js";
}
});
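A minimal sketch of standing up a single Server topology with the options documented above; host and port are placeholders and the callback's result.result envelope is assumed from the CommandResult usage elsewhere in this diff:

```
var Server = require('mongodb-core').Server;

var server = new Server({ host: 'localhost', port: 27017, reconnect: true });

server.on('connect', function(_server) {
  console.log(_server.type, _server.parserType); // e.g. 'server' 'js'
  _server.command('admin.$cmd', { ismaster: true }, {}, function(err, result) {
    if (!err) console.log('ismaster:', result.result.ismaster); // assumed envelope shape
    _server.destroy();
  });
});

server.connect();
```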
Server.enableServerAccounting = function() {
serverAccounting = true;
servers = {};
@ -176,7 +188,7 @@ function configureWireProtocolHandler(self, ismaster) {
function disconnectHandler(self, type, ns, cmd, options, callback) {
// Topology is not connected, save the call in the provided store to be
// Executed at some point when the handler deems it's reconnected
if(!self.s.pool.isConnected() && self.s.disconnectHandler != null && !options.monitoring) {
if(!self.s.pool.isConnected() && self.s.options.reconnect && self.s.disconnectHandler != null && !options.monitoring) {
self.s.disconnectHandler.add(type, ns, cmd, options, callback);
return true;
}
@ -201,8 +213,12 @@ function monitoringProcess(self) {
var query = new Query(self.s.bson, 'admin.$cmd', {ismaster:true}, queryOptions);
// Get start time
var start = new Date().getTime();
// Execute the ismaster query
self.s.pool.write(query, function(err, result) {
self.s.pool.write(query, {
socketTimeout: (typeof self.s.options.connectionTimeout !== 'number') ? 2000 : self.s.options.connectionTimeout,
monitoring: true,
}, function(err, result) {
// Set initial lastIsMasterMS
self.lastIsMasterMS = new Date().getTime() - start;
if(self.s.pool.isDestroyed()) return;
@ -235,7 +251,9 @@ var eventHandler = function(self, event) {
// Get start time
var start = new Date().getTime();
// Execute the ismaster query
self.s.pool.write(query, function(err, result) {
self.s.pool.write(query, {
socketTimeout: self.s.options.connectionTimeout || 2000,
}, function(err, result) {
// Set initial lastIsMasterMS
self.lastIsMasterMS = new Date().getTime() - start;
if(err) {
@ -263,13 +281,15 @@ var eventHandler = function(self, event) {
// Emit server description changed if something listening
sdam.emitServerDescriptionChanged(self, {
address: self.name, arbiters: [], hosts: [], passives: [], type: !self.s.inTopology ? 'Standalone' : sdam.getTopologyType(self)
address: self.name, arbiters: [], hosts: [], passives: [], type: sdam.getTopologyType(self)
});
// Emit topology description changed if something listening
sdam.emitTopologyDescriptionChanged(self, {
topologyType: 'Single', servers: [{address: self.name, arbiters: [], hosts: [], passives: [], type: 'Standalone'}]
});
if(!self.s.inTopology) {
// Emit topology description changed if something listening
sdam.emitTopologyDescriptionChanged(self, {
topologyType: 'Single', servers: [{address: self.name, arbiters: [], hosts: [], passives: [], type: sdam.getTopologyType(self)}]
});
}
// Log the ismaster if available
if(self.s.logger.isInfo()) {
@ -292,6 +312,13 @@ var eventHandler = function(self, event) {
delete servers[self.id];
}
if (event === 'close') {
// Closing emits a server description changed event going to unknown.
sdam.emitServerDescriptionChanged(self, {
address: self.name, arbiters: [], hosts: [], passives: [], type: 'Unknown'
});
}
// Reconnect failed return error
if(event == 'reconnectFailed') {
self.emit('reconnectFailed', err);
@ -306,11 +333,16 @@ var eventHandler = function(self, event) {
// On first connect fail
if(self.s.pool.state == 'disconnected' && self.initalConnect && ['close', 'timeout', 'error', 'parseError'].indexOf(event) != -1) {
self.initalConnect = false;
return self.emit('error', new MongoError(f('failed to connect to server [%s] on first connect', self.name)));
return self.emit('error', new MongoError(f('failed to connect to server [%s] on first connect [%s]', self.name, err)));
}
// Reconnect event, emit the server
if(event == 'reconnect') {
// Reconnecting emits a server description changed event going from unknown to the
// current server type.
sdam.emitServerDescriptionChanged(self, {
address: self.name, arbiters: [], hosts: [], passives: [], type: sdam.getTopologyType(self)
});
return self.emit(event, self);
}
@ -334,7 +366,7 @@ Server.prototype.connect = function(options) {
// Do not allow connect to be called on anything that's not disconnected
if(self.s.pool && !self.s.pool.isDisconnected() && !self.s.pool.isDestroyed()) {
throw MongoError.create(f('server instance in invalid state %s', self.s.state));
throw MongoError.create(f('server instance in invalid state %s', self.s.pool.state));
}
// Create a pool
@ -445,6 +477,7 @@ function basicReadValidations(self, options) {
* @param {object} cmd The command hash
* @param {ReadPreference} [options.readPreference] Specify read preference if command supports it
* @param {Boolean} [options.serializeFunctions=false] Specify if functions on an object should be serialized.
* @param {Boolean} [options.checkKeys=false] Specify if the bson parser should validate keys.
* @param {Boolean} [options.ignoreUndefined=false] Specify if the BSON serializer should ignore undefined fields.
* @param {Boolean} [options.fullResult=false] Return the full envelope instead of just the result document.
* @param {opResultCallback} callback A callback function
@ -455,6 +488,9 @@ Server.prototype.command = function(ns, cmd, options, callback) {
var result = basicReadValidations(self, options);
if(result) return callback(result);
// Clone the options
options = assign({}, options, { wireProtocolCommand: false });
// Debug log
if(self.s.logger.isDebug()) self.s.logger.debug(f('executing command [%s] against %s', JSON.stringify({
ns: ns, cmd: cmd, options: debugOptions(debugFields, options)
@ -477,8 +513,10 @@ Server.prototype.command = function(ns, cmd, options, callback) {
ignoreUndefined: typeof options.ignoreUndefined == 'boolean' ? options.ignoreUndefined : false
};
// Create a query instance
var query = new Query(self.s.bson, ns, cmd, queryOptions);
// Are we executing against a specific topology
var topology = options.topology || {};
// Create the query object
var query = self.wireProtocolHandler.command(self.s.bson, ns, cmd, {}, topology, options);
// Set slave OK of the query
query.slaveOk = options.readPreference ? options.readPreference.slaveOk() : false;
@ -666,8 +704,8 @@ Server.prototype.auth = function(mechanism, db) {
* @return {boolean}
*/
Server.prototype.equals = function(server) {
if(typeof server == 'string') return this.name == server;
if(server.name) return this.name == server.name;
if(typeof server == 'string') return this.name.toLowerCase() == server.toLowerCase();
if(server.name) return this.name.toLowerCase() == server.name.toLowerCase();
return false;
}
@ -719,6 +757,9 @@ Server.prototype.destroy = function(options) {
clearTimeout(this.monitoringProcessId);
}
// No pool, return
if(!self.s.pool) return;
// Emit close event
if(options.emitClose) {
self.emit('close', self);

View file

@ -14,7 +14,7 @@ function emitSDAMEvent(self, event, description) {
}
// Get package.json variable
var driverVersion = require(__dirname + '/../../package.json').version;
var driverVersion = require('../../package.json').version;
var nodejsversion = f('Node.js %s, %s', process.version, os.endianness());
var type = os.type();
var name = process.platform;
@ -123,8 +123,8 @@ var getTopologyType = function(self, ismaster) {
}
if(!ismaster) return 'Unknown';
if(ismaster.ismaster && !ismaster.hosts) return 'Standalone';
if(ismaster.ismaster && ismaster.msg == 'isdbgrid') return 'Mongos';
if(ismaster.ismaster && !ismaster.hosts) return 'Standalone';
if(ismaster.ismaster) return 'RSPrimary';
if(ismaster.secondary) return 'RSSecondary';
if(ismaster.arbiterOnly) return 'RSArbiter';
@ -180,31 +180,6 @@ var inquireServerState = function(self) {
};
}
// Object.assign method or polyfill
var assign = Object.assign ? Object.assign : function assign(target) {
if (target === undefined || target === null) {
throw new TypeError('Cannot convert first argument to object');
}
var to = Object(target);
for (var i = 1; i < arguments.length; i++) {
var nextSource = arguments[i];
if (nextSource === undefined || nextSource === null) {
continue;
}
var keysArray = Object.keys(Object(nextSource));
for (var nextIndex = 0, len = keysArray.length; nextIndex < len; nextIndex++) {
var nextKey = keysArray[nextIndex];
var desc = Object.getOwnPropertyDescriptor(nextSource, nextKey);
if (desc !== undefined && desc.enumerable) {
to[nextKey] = nextSource[nextKey];
}
}
}
return to;
}
//
// Clone the options
var cloneOptions = function(options) {
@ -215,11 +190,145 @@ var cloneOptions = function(options) {
return opts;
}
function Interval(fn, time) {
var timer = false;
this.start = function () {
if (!this.isRunning()) {
timer = setInterval(fn, time);
}
return this;
};
this.stop = function () {
clearInterval(timer);
timer = false;
return this;
};
this.isRunning = function () {
return timer !== false;
};
}
function Timeout(fn, time) {
var timer = false;
this.start = function () {
if (!this.isRunning()) {
timer = setTimeout(function() {
fn();
if (timer && timer._called === undefined) {
// The artificial _called is set here for compatibility with node.js 0.10.x/0.12.x versions
timer._called = true;
}
}, time);
}
return this;
};
this.stop = function () {
clearTimeout(timer);
timer = false;
return this;
};
this.isRunning = function () {
if(timer && timer._called) return false;
return timer !== false;
};
}
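A short usage sketch for the two wrappers above; the 100ms delay is arbitrary:

```
var heartbeat = new Interval(function() { console.log('tick'); }, 100);
heartbeat.start();
console.log(heartbeat.isRunning()); // true
heartbeat.stop();
console.log(heartbeat.isRunning()); // false

var oneShot = new Timeout(function() { console.log('fired'); }, 100);
oneShot.start();
// isRunning() returns false again once the callback has fired (tracked via _called).
```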
function diff(previous, current) {
// Difference document
var diff = {
servers: []
}
// Previous entry
if(!previous) {
previous = { servers: [] };
}
// Check if we have any previous servers missing in the current ones
for(var i = 0; i < previous.servers.length; i++) {
var found = false;
for(var j = 0; j < current.servers.length; j++) {
if(current.servers[j].address.toLowerCase()
=== previous.servers[i].address.toLowerCase()) {
found = true;
break;
}
}
if(!found) {
// Add to the diff
diff.servers.push({
address: previous.servers[i].address,
from: previous.servers[i].type,
to: 'Unknown',
});
}
}
// Check if there are any current servers missing in the previous ones
for(var j = 0; j < current.servers.length; j++) {
var found = false;
// Go over all the previous servers
for(var i = 0; i < previous.servers.length; i++) {
if(previous.servers[i].address.toLowerCase()
=== current.servers[j].address.toLowerCase()) {
found = true;
break;
}
}
// Add the server to the diff
if(!found) {
diff.servers.push({
address: current.servers[j].address,
from: 'Unknown',
to: current.servers[j].type,
});
}
}
// Got through all the servers
for(var i = 0; i < previous.servers.length; i++) {
var prevServer = previous.servers[i];
// Go through all current servers
for(var j = 0; j < current.servers.length; j++) {
var currServer = current.servers[j];
// Matching server
if(prevServer.address.toLowerCase() === currServer.address.toLowerCase()) {
// We had a change in state
if(prevServer.type != currServer.type) {
diff.servers.push({
address: prevServer.address,
from: prevServer.type,
to: currServer.type
});
}
}
}
}
// Return difference
return diff;
}
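A worked example of the rewritten diff() helper; the addresses and server types are illustrative:

```
var previous = { servers: [
  { address: 'localhost:31000', type: 'RSPrimary' },
  { address: 'localhost:31001', type: 'RSSecondary' }
]};

var current = { servers: [
  { address: 'localhost:31001', type: 'RSPrimary' },
  { address: 'localhost:31002', type: 'RSSecondary' }
]};

console.log(diff(previous, current).servers);
// [ { address: 'localhost:31000', from: 'RSPrimary',   to: 'Unknown' },
//   { address: 'localhost:31002', from: 'Unknown',     to: 'RSSecondary' },
//   { address: 'localhost:31001', from: 'RSSecondary', to: 'RSPrimary' } ]
```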
module.exports.inquireServerState = inquireServerState
module.exports.getTopologyType = getTopologyType;
module.exports.emitServerDescriptionChanged = emitServerDescriptionChanged;
module.exports.emitTopologyDescriptionChanged = emitTopologyDescriptionChanged;
module.exports.cloneOptions = cloneOptions;
module.exports.assign = assign;
module.exports.createClientInfo = createClientInfo;
module.exports.clone = clone;
module.exports.diff = diff;
module.exports.Interval = Interval;
module.exports.Timeout = Timeout;

32
node/node_modules/mongodb-core/lib/utils.js generated vendored Normal file
View file

@ -0,0 +1,32 @@
/**
* Copy the values of all enumerable own properties from one or more
* source objects to a target object. It will return the target object.
*/
var assign = Object.assign ? Object.assign : function assign(target) {
if (target === undefined || target === null) {
throw new TypeError('Cannot convert first argument to object');
}
var to = Object(target);
for (var i = 1; i < arguments.length; i++) {
var nextSource = arguments[i];
if (nextSource === undefined || nextSource === null) {
continue;
}
var keysArray = Object.keys(Object(nextSource));
for (var nextIndex = 0, len = keysArray.length; nextIndex < len; nextIndex++) {
var nextKey = keysArray[nextIndex];
var desc = Object.getOwnPropertyDescriptor(nextSource, nextKey);
if (desc !== undefined && desc.enumerable) {
to[nextKey] = nextSource[nextKey];
}
}
}
return to;
}
module.exports = {
assign: assign
};
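Usage sketch for the exported helper; the values are arbitrary and the require path is relative to lib/. On modern Node it simply delegates to Object.assign:

```
var assign = require('./utils').assign;

var defaults = { size: 5, reconnect: true };
var merged = assign({}, defaults, { size: 10 });
console.log(merged); // { size: 10, reconnect: true }
```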

View file

@ -4,15 +4,18 @@ var Insert = require('./commands').Insert
, Update = require('./commands').Update
, Remove = require('./commands').Remove
, copy = require('../connection/utils').copy
, retrieveBSON = require('../connection/utils').retrieveBSON
, KillCursor = require('../connection/commands').KillCursor
, GetMore = require('../connection/commands').GetMore
, Query = require('../connection/commands').Query
, f = require('util').format
, CommandResult = require('../connection/command_result')
, MongoError = require('../error')
, Long = require('bson').Long
, getReadPreference = require('./shared').getReadPreference;
var BSON = retrieveBSON(),
Long = BSON.Long;
// Write concern fields
var writeConcernFields = ['w', 'wtimeout', 'j', 'fsync'];
@ -118,26 +121,29 @@ WireProtocol.prototype.getMore = function(bson, ns, cursorState, batchSize, raw,
callback(null, null, r.connection);
}
// Contains any query options
var queryOptions = {};
// If we have a raw query decorate the function
if(raw) {
queryCallback.raw = raw;
queryOptions.raw = raw;
}
// Check if we need to promote longs
if(typeof cursorState.promoteLongs == 'boolean') {
queryCallback.promoteLongs = cursorState.promoteLongs;
queryOptions.promoteLongs = cursorState.promoteLongs;
}
if(typeof cursorState.promoteValues == 'boolean') {
queryCallback.promoteValues = cursorState.promoteValues;
queryOptions.promoteValues = cursorState.promoteValues;
}
if(typeof cursorState.promoteBuffers == 'boolean') {
queryCallback.promoteBuffers = cursorState.promoteBuffers;
queryOptions.promoteBuffers = cursorState.promoteBuffers;
}
// Write out the getMore command
connection.write(getMore, queryCallback);
connection.write(getMore, queryOptions, queryCallback);
}
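The pattern above, repeated in the other wire protocol files below, moves the promotion flags off the callback and into an options object; a condensed sketch of the resulting call shape, with placeholder flag values:

```
// Per-call deserialization flags now travel alongside the message.
var queryOptions = { raw: false, promoteLongs: true, promoteValues: true, promoteBuffers: false };
connection.write(getMore, queryOptions, queryCallback);
```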
WireProtocol.prototype.command = function(bson, ns, cmd, cursorState, topology, options) {

View file

@ -1,14 +1,17 @@
"use strict";
var copy = require('../connection/utils').copy
, retrieveBSON = require('../connection/utils').retrieveBSON
, KillCursor = require('../connection/commands').KillCursor
, GetMore = require('../connection/commands').GetMore
, Query = require('../connection/commands').Query
, f = require('util').format
, MongoError = require('../error')
, Long = require('bson').Long
, getReadPreference = require('./shared').getReadPreference;
var BSON = retrieveBSON(),
Long = BSON.Long;
var WireProtocol = function() {}
//
@ -48,6 +51,7 @@ var executeWrite = function(pool, bson, type, opsField, ns, ops, options, callba
var opts = { command: true };
var queryOptions = { checkKeys : false, numberToSkip: 0, numberToReturn: 1 };
if(type == 'insert') queryOptions.checkKeys = true;
if(typeof options.checkKeys == 'boolean') queryOptions.checkKeys = options.checkKeys;
// Ensure we support serialization of functions
if(options.serializeFunctions) queryOptions.serializeFunctions = options.serializeFunctions;
// Do not serialize the undefined fields
@ -121,26 +125,29 @@ WireProtocol.prototype.getMore = function(bson, ns, cursorState, batchSize, raw,
callback(null, null, r.connection);
}
// Contains any query options
var queryOptions = {};
// If we have a raw query decorate the function
if(raw) {
queryCallback.raw = raw;
queryOptions.raw = raw;
}
// Check if we need to promote longs
if(typeof cursorState.promoteLongs == 'boolean') {
queryCallback.promoteLongs = cursorState.promoteLongs;
queryOptions.promoteLongs = cursorState.promoteLongs;
}
if(typeof cursorState.promoteValues == 'boolean') {
queryCallback.promoteValues = cursorState.promoteValues;
queryOptions.promoteValues = cursorState.promoteValues;
}
if(typeof cursorState.promoteBuffers == 'boolean') {
queryCallback.promoteBuffers = cursorState.promoteBuffers;
queryOptions.promoteBuffers = cursorState.promoteBuffers;
}
// Write out the getMore command
connection.write(getMore, queryCallback);
connection.write(getMore, queryOptions, queryCallback);
}
WireProtocol.prototype.command = function(bson, ns, cmd, cursorState, topology, options) {

View file

@ -1,11 +1,14 @@
"use strict";
var Query = require('../connection/commands').Query
, retrieveBSON = require('../connection/utils').retrieveBSON
, f = require('util').format
, MongoError = require('../error')
, Long = require('bson').Long
, getReadPreference = require('./shared').getReadPreference;
var BSON = retrieveBSON(),
Long = BSON.Long;
var WireProtocol = function(legacyWireProtocol) {
this.legacyWireProtocol = legacyWireProtocol;
}
@ -56,6 +59,7 @@ var executeWrite = function(pool, bson, type, opsField, ns, ops, options, callba
var opts = { command: true };
var queryOptions = { checkKeys : false, numberToSkip: 0, numberToReturn: 1 };
if(type == 'insert') queryOptions.checkKeys = true;
if(typeof options.checkKeys == 'boolean') queryOptions.checkKeys = options.checkKeys;
// Ensure we support serialization of functions
if(options.serializeFunctions) queryOptions.serializeFunctions = options.serializeFunctions;
@ -223,11 +227,11 @@ WireProtocol.prototype.getMore = function(bson, ns, cursorState, batchSize, raw,
}
if(typeof cursorState.promoteValues == 'boolean') {
queryCallback.promoteValues = cursorState.promoteValues;
queryOptions.promoteValues = cursorState.promoteValues;
}
if(typeof cursorState.promoteBuffers == 'boolean') {
queryCallback.promoteBuffers = cursorState.promoteBuffers;
queryOptions.promoteBuffers = cursorState.promoteBuffers;
}
// Write out the getMore command
@ -235,8 +239,12 @@ WireProtocol.prototype.getMore = function(bson, ns, cursorState, batchSize, raw,
}
WireProtocol.prototype.command = function(bson, ns, cmd, cursorState, topology, options) {
options = options || {}
// Check if this is a wire protocol command or not
var wireProtocolCommand = typeof options.wireProtocolCommand == 'boolean' ? options.wireProtocolCommand : true;
// Establish type of command
if(cmd.find) {
if(cmd.find && wireProtocolCommand) {
// Create the find command
var query = executeFindCommand(bson, ns, cmd, cursorState, topology, options)
// Mark the cmd as virtual
@ -385,8 +393,6 @@ var executeFindCommand = function(bson, ns, cmd, cursorState, topology, options)
if(cmd.skip) findCmd.skip = cmd.skip;
// Add a limit
if(cmd.limit) findCmd.limit = cmd.limit;
// Add a batchSize
if(typeof cmd.batchSize == 'number') findCmd.batchSize = Math.abs(cmd.batchSize);
// Check if we wish to have a singleBatch
if(cmd.limit < 0) {
@ -394,6 +400,19 @@ var executeFindCommand = function(bson, ns, cmd, cursorState, topology, options)
findCmd.singleBatch = true;
}
// Add a batchSize
if(typeof cmd.batchSize == 'number') {
if (cmd.batchSize < 0) {
if (cmd.limit != 0 && Math.abs(cmd.batchSize) < Math.abs(cmd.limit)) {
findCmd.limit = Math.abs(cmd.batchSize);
}
findCmd.singleBatch = true;
}
findCmd.batchSize = Math.abs(cmd.batchSize);
}
// If we have comment set
if(cmd.comment) findCmd.comment = cmd.comment;

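The reworked batchSize handling above maps a negative `batchSize` onto the find command's `singleBatch`/`limit` fields. A small sketch of that rule as a standalone function (hypothetical helper, assuming the same semantics as `executeFindCommand`):

```js
// Hypothetical helper: apply cmd.batchSize to a find command document the way
// the hunk above does. A negative batchSize means "one batch of at most
// |batchSize| documents".
function applyBatchSize(findCmd, cmd) {
  if (typeof cmd.batchSize === 'number') {
    if (cmd.batchSize < 0) {
      if (cmd.limit !== 0 && Math.abs(cmd.batchSize) < Math.abs(cmd.limit)) {
        findCmd.limit = Math.abs(cmd.batchSize);
      }
      findCmd.singleBatch = true;
    }
    findCmd.batchSize = Math.abs(cmd.batchSize);
  }
  return findCmd;
}

// applyBatchSize({}, { batchSize: -5, limit: 100 })
// => { limit: 5, singleBatch: true, batchSize: 5 }
```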
View file

@ -12,7 +12,7 @@ var Insert = function(requestId, ismaster, bson, ns, documents, options) {
if(ns == null) throw new MongoError("ns must be specified for query");
if(!Array.isArray(documents) || documents.length == 0) throw new MongoError("documents array must contain at least one document to insert");
// Validate that we are not passing 0x00 in the colletion name
// Validate that we are not passing 0x00 in the collection name
if(!!~ns.indexOf("\x00")) {
throw new MongoError("namespace cannot contain a null character");
}
@ -56,11 +56,11 @@ Insert.prototype.toBin = function() {
// Serialize all the documents
for(var i = 0; i < this.documents.length; i++) {
var buffer = this.bson.serialize(this.documents[i]
, this.checkKeys
, true
, this.serializeFunctions
, 0, this.ignoreUndefined);
var buffer = this.bson.serialize(this.documents[i], {
checkKeys: this.checkKeys,
serializeFunctions: this.serializeFunctions,
ignoreUndefined: this.ignoreUndefined,
});
// Document is larger than maxBsonObjectSize, terminate serialization
if(buffer.length > this.ismaster.maxBsonObjectSize) {
@ -173,20 +173,20 @@ Update.prototype.toBin = function() {
var totalLength = header.length;
// Serialize the selector
var selector = this.bson.serialize(this.q
, this.checkKeys
, true
, this.serializeFunctions
, 0, this.ignoreUndefined);
var selector = this.bson.serialize(this.q, {
checkKeys: this.checkKeys,
serializeFunctions: this.serializeFunctions,
ignoreUndefined: this.ignoreUndefined,
});
buffers.push(selector);
totalLength = totalLength + selector.length;
// Serialize the update
var update = this.bson.serialize(this.u
, this.checkKeys
, true
, this.serializeFunctions
, 0, this.ignoreUndefined);
var update = this.bson.serialize(this.u, {
checkKeys: this.checkKeys,
serializeFunctions: this.serializeFunctions,
ignoreUndefined: this.ignoreUndefined,
});
buffers.push(update);
totalLength = totalLength + update.length;
@ -289,11 +289,11 @@ Remove.prototype.toBin = function() {
var totalLength = header.length;
// Serialize the selector
var selector = this.bson.serialize(this.q
, this.checkKeys
, true
, this.serializeFunctions
, 0, this.ignoreUndefined);
var selector = this.bson.serialize(this.q, {
checkKeys: this.checkKeys,
serializeFunctions: this.serializeFunctions,
ignoreUndefined: this.ignoreUndefined,
});
buffers.push(selector);
totalLength = totalLength + selector.length;

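The three hunks above all track the same js-bson 1.x API change: `serialize` takes an options object instead of the old positional flags. A minimal sketch against the plain `bson` package (the options shown are the ones used in `commands.js`; the values are illustrative):

```js
var BSON = require('bson');
var bson = new BSON();

// bson 1.x: an options object replaces the positional arguments
var buffer = bson.serialize({ a: 1 }, {
  checkKeys: true,            // reject keys containing '.' or starting with '$'
  serializeFunctions: false,  // do not serialize function-valued fields
  ignoreUndefined: true       // drop fields whose value is undefined
});

console.log(buffer.length); // size in bytes of the serialized document
```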
View file

@ -1,49 +1,26 @@
{
"_args": [
[
{
"raw": "mongodb-core@2.0.14",
"scope": null,
"escapedName": "mongodb-core",
"name": "mongodb-core",
"rawSpec": "2.0.14",
"spec": "2.0.14",
"type": "version"
},
"/Users/sclay/projects/newsblur/node/node_modules/mongodb"
]
],
"_from": "mongodb-core@2.0.14",
"_id": "mongodb-core@2.0.14",
"_inCache": true,
"_from": "mongodb-core@2.1.20",
"_id": "mongodb-core@2.1.20",
"_inBundle": false,
"_integrity": "sha512-IN57CX5/Q1bhDq6ShAR6gIv4koFsZP7L8WOK1S0lR0pVDQaScffSMV5jxubLsmZ7J+UdqmykKw4r9hG3XQEGgQ==",
"_location": "/mongodb-core",
"_nodeVersion": "7.1.0",
"_npmOperationalInternal": {
"host": "packages-12-west.internal.npmjs.com",
"tmp": "tmp/mongodb-core-2.0.14.tgz_1480414053968_0.9173268575686961"
},
"_npmUser": {
"name": "christkv",
"email": "christkv@gmail.com"
},
"_npmVersion": "3.10.9",
"_phantomChildren": {},
"_requested": {
"raw": "mongodb-core@2.0.14",
"scope": null,
"escapedName": "mongodb-core",
"type": "version",
"registry": true,
"raw": "mongodb-core@2.1.20",
"name": "mongodb-core",
"rawSpec": "2.0.14",
"spec": "2.0.14",
"type": "version"
"escapedName": "mongodb-core",
"rawSpec": "2.1.20",
"saveSpec": null,
"fetchSpec": "2.1.20"
},
"_requiredBy": [
"/mongodb"
],
"_resolved": "https://registry.npmjs.org/mongodb-core/-/mongodb-core-2.0.14.tgz",
"_shasum": "4e8743b87343d169a7622535edbd47dcacd790be",
"_shrinkwrap": null,
"_spec": "mongodb-core@2.0.14",
"_resolved": "https://registry.npmjs.org/mongodb-core/-/mongodb-core-2.1.20.tgz",
"_shasum": "fece8dd76b59ee7d7f2d313b65322c160492d8f1",
"_spec": "mongodb-core@2.1.20",
"_where": "/Users/sclay/projects/newsblur/node/node_modules/mongodb",
"author": {
"name": "Christian Kvalheim"
@ -51,32 +28,30 @@
"bugs": {
"url": "https://github.com/christkv/mongodb-core/issues"
},
"bundleDependencies": false,
"dependencies": {
"bson": "~0.5.7",
"bson": "~1.0.4",
"require_optional": "~1.0.0"
},
"deprecated": false,
"description": "Core MongoDB driver functionality, no bells and whistles and meant for integration not end applications",
"devDependencies": {
"co": "^4.5.4",
"conventional-changelog-cli": "^1.3.5",
"coveralls": "^2.11.6",
"es6-promise": "^3.0.2",
"gleak": "0.5.0",
"integra": "0.1.8",
"jsdoc": "3.3.0-alpha8",
"mkdirp": "0.5.0",
"mongodb-topology-manager": "1.0.6",
"mongodb-topology-manager": "1.0.x",
"mongodb-version-manager": "github:christkv/mongodb-version-manager#master",
"nyc": "^5.5.0",
"optimist": "latest",
"request": "2.65.0",
"rimraf": "2.2.6",
"semver": "4.1.0"
},
"directories": {},
"dist": {
"shasum": "4e8743b87343d169a7622535edbd47dcacd790be",
"tarball": "https://registry.npmjs.org/mongodb-core/-/mongodb-core-2.0.14.tgz"
},
"gitHead": "90884c683392b9c4643ac4f9c94f9f9d08bd3ff7",
"homepage": "https://github.com/christkv/mongodb-core",
"keywords": [
"mongodb",
@ -84,26 +59,20 @@
],
"license": "Apache-2.0",
"main": "index.js",
"maintainers": [
{
"name": "christkv",
"email": "christkv@gmail.com"
}
],
"name": "mongodb-core",
"optionalDependencies": {},
"peerOptionalDependencies": {
"kerberos": "~0.0"
"kerberos": "~0.0",
"bson-ext": "1.0.5"
},
"readme": "ERROR: No README data found!",
"repository": {
"type": "git",
"url": "git://github.com/christkv/mongodb-core.git"
},
"scripts": {
"changelog": "conventional-changelog -p angular -i HISTORY.md -s",
"coverage": "nyc node test/runner.js -t functional -l && node_modules/.bin/nyc report --reporter=text-lcov | node_modules/.bin/coveralls",
"lint": "eslint lib",
"test": "node test/runner.js -t functional"
},
"version": "2.0.14"
"version": "2.1.20"
}

File diff suppressed because it is too large

View file

@ -1,22 +0,0 @@
var ReplSet = require('./').ReplSet;
var replSet = new ReplSet([
{ host: '10.211.55.6', port: 31000 },
{ host: '10.211.55.12', port: 31001 },
{ host: '10.211.55.13', port: 31002 }
], {
connectionTimeout: 10000,
});
replSet.on('connect', function(_server) {
setInterval(function() {
_server.command('system.$cmd', { ping: 1 }, function(err, result) {
if (err) {
console.error(err);
} else {
console.log(result.result);
}
});
}, 1000);
});
replSet.connect()

3254
node/node_modules/mongodb-core/yarn.lock generated vendored Normal file

File diff suppressed because it is too large

214
node/node_modules/mongodb/HISTORY.md generated vendored
View file

@ -1,4 +1,214 @@
2.2.12 2016-10-21
<a name="2.2.35"></a>
## [2.2.35](https://github.com/mongodb/node-mongodb-native/compare/v2.2.34...v2.2.35) (2018-02-26)
### Bug Fixes
* **url parser:** preserve auth creds when composing conn string ([#1641](https://github.com/mongodb/node-mongodb-native/issues/1641)) ([ecedce6](https://github.com/mongodb/node-mongodb-native/commit/ecedce6))
### Features
* **core**: update mongodb-core to 2.1.19
<a name="2.2.34"></a>
## [2.2.34](https://github.com/mongodb/node-mongodb-native/compare/v2.2.33...v2.2.34) (2018-01-03)
### Bug Fixes
* **collection:** allow { upsert: 1 } for findOneAndUpdate() and update() ([#1580](https://github.com/mongodb/node-mongodb-native/issues/1580)) ([0f338c8](https://github.com/mongodb/node-mongodb-native/commit/0f338c8)), closes [Automattic/mongoose#5839](https://github.com/Automattic/mongoose/issues/5839)
* **GridFS:** fix TypeError: doc.data.length is not a function ([811de0c](https://github.com/mongodb/node-mongodb-native/commit/811de0c))
* **import:** adds missing import to lib/authenticate.js ([10db9a2](https://github.com/mongodb/node-mongodb-native/commit/10db9a2))
* **list-collections:** ensure default of primary ReadPreference ([0935306](https://github.com/mongodb/node-mongodb-native/commit/0935306))
### Features
* **ssl:** adds missing ssl options ssl options for `ciphers` and `ecdhCurve` ([bd4fb53](https://github.com/mongodb/node-mongodb-native/commit/bd4fb53))
* **url parser:** add dns seedlist support ([2d357bc](https://github.com/mongodb/node-mongodb-native/commit/2d357bc))
* **core**: update mongodb-core to 2.1.18
2.2.33 2017-10-12
-----------------
* update to mongodb-core 2.1.17
2.2.32 2017-10-11
-----------------
* update to mongodb-core 2.1.16
* ensure that the `cursor` key is always present in aggregation commands
* `Cursor.prototype.hasNext` now propagates errors when using callback
* allow passing `noCursorTimeout` as an option to `find()`
* bubble up `reconnectFailed` event from Server topology
2.2.31 2017-08-08
-----------------
* update mongodb-core to 2.1.15
* allow auth option in MongoClient.connect
* remove duplicate option `promoteLongs` from MongoClient's `connect`
* bulk operations should not throw an error on empty batch
2.2.30 2017-07-07
-----------------
* Update mongodb-core to 2.1.14
* MongoClient
* add `appname` to list of valid option names
* added test for passing appname as option
* NODE-1052 ensure user options are applied while parsing connection string uris
2.2.29 2017-06-19
-----------------
* Update mongodb-core to 2.1.13
* NODE-1039 ensure we force destroy server instances, forcing queue to be flushed.
* Use actual server type in standalone SDAM events.
* Allow multiple map calls (Issue #1521, https://github.com/Robbilie).
* Clone insertMany options before mutating (Issue #1522, https://github.com/vkarpov15).
* NODE-1034 Fix GridStore issue caused by Node 8.0.0 breaking backward compatible fs.read API.
* NODE-1026, use operator instead of skip function in order to avoid useless fetch stage.
2.2.28 2017-06-02
-----------------
* Update mongodb-core to 2.1.12
* NODE-1019 Set keepAlive to 300 seconds or 1/2 of socketTimeout if socketTimeout < keepAlive.
* Minor fix to report the correct state on error.
* NODE-1020 'family' was added to options to provide high priority for ipv6 addresses (Issue #1518, https://github.com/firej).
* Fix require_optional loading of bson-ext.
* Ensure no errors are thrown by replset if topology is destroyed before it finished connecting.
* NODE-999 SDAM fixes for Mongos and single Server event emitting.
* NODE-1014 Set socketTimeout to default to 360 seconds.
* NODE-1019 Set keepAlive to 300 seconds or 1/2 of socketTimeout if socketTimeout < keepAlive.
* Just handle Collection name errors distinctly from general callback errors avoiding double callbacks in Db.collection.
* NODE-999 SDAM fixes for Mongos and single Server event emitting.
* NODE-1000 Added guard condition for upload.js checkDone function in case of race condition caused by late arriving chunk write.
2.2.27 2017-05-22
-----------------
* Updated mongodb-core to 2.1.11
* NODE-987 Clear out old intervalIds when calling topologyMonitor.
* NODE-987 Moved filtering to pingServer method and added test case.
* Check for connection destroyed just before writing out and flush out operations correctly if it is (Issue #179, https://github.com/jmholzinger).
* NODE-989 Refactored Replicaset monitoring to correctly monitor newly added servers; also extracted setTimeout and setInterval to use custom wrappers Timeout and Interval.
* NODE-985 Deprecated Db.authenticate and Admin.authenticate and moved auth methods into authenticate.js to ensure MongoClient.connect does not print deprecation warnings.
* NODE-988 Merged readConcern and hint correctly on collection(...).find(...).count()
* Fix passing the readConcern option to MongoClient.connect (Issue #1514, https://github.com/bausmeier).
* NODE-996 Propagate all events up to a MongoClient instance.
* Allow saving doc with null `_id` (Issue #1517, https://github.com/vkarpov15).
* NODE-993 Expose hasNext for command cursor and add docs for both CommandCursor and Aggregation Cursor.
2.2.26 2017-04-18
-----------------
* Updated mongodb-core to 2.1.10
* NODE-981 delegate auth to replset/mongos if inTopology is set.
* NODE-978 Wrap connection.end in try/catch for a node 0.10.x issue causing exceptions to be thrown; also surfaced getConnection for mongos and replset.
* Remove dynamic require (Issue #175, https://github.com/tellnes).
* NODE-696 Handle interrupted error for createIndexes.
* Fixed issue where a find command executed via Server.command was interpreted as a wire protocol message, #172.
* NODE-966 promoteValues not being applied correctly to getMore.
* Merged in fix for flushing out monitoring operations.
* NODE-983 Add cursorId to aggregate and listCollections commands (Issue, #1510).
* Mark group and profilingInfo as deprecated methods
* NODE-956 DOCS Examples.
* Update readable-stream to version 2.2.7.
* NODE-978 Added test case to uncover connection.end issue for node 0.10.x.
* NODE-972 Fix(db): don't remove database name if collectionName == dbName (Issue, #1502)
* Fixed merging of writeConcerns on db.collection method.
* NODE-970 mix in readPreference for strict mode listCollections callback.
* NODE-966 added testcase for promoteValues being applied to getMore commands.
* NODE-962 Merge in ignoreUndefined from collection level for find/findOne.
* Remove multi option from updateMany tests/docs (Issue #1499, https://github.com/spratt).
* NODE-963 Correctly handle cursor.count when using APM.
2.2.25 2017-03-17
-----------------
* Don't rely on global toString() for checking if object (Issue #1494, https://github.com/vkarpov15).
* Remove obsolete option uri_decode_auth (Issue #1488, https://github.com/kamagatos).
* NODE-936 Correctly translate ReadPreference to CoreReadPreference for mongos queries.
* Exposed BSONRegExp type.
* NODE-950 push correct index for INSERT ops (https://github.com/mbroadst).
* NODE-951 Added support for sslCRL option and added a test case for it.
* NODE-953 Made batchSize issue general at cursor level.
* NODE-954 Remove write concern from reindex helper as it will not be supported in 3.6.
* Updated mongodb-core to 2.1.9.
* Return lastIsMaster correctly when connecting with secondaryOnlyConnectionAllowed is set to true and only a secondary is available in replica state.
* Clone options when passed to wireProtocol handler to avoid intermittent modifications causing errors.
* Ensure SSL errors propagate better for Replset connections when there is an SSL validation error.
* NODE-957 Fixed issue where fewer results than batchSize did not cause the cursor to be closed on execution of the first batch.
* NODE-958 Store reconnectConnection on pool object to allow destroy to close immediately.
2.2.24 2017-02-14
-----------------
* NODE-935, NODE-931 Make MongoClient strict options validation optional and instead print annoying console.warn entries.
2.2.23 2017-02-13
-----------------
* Updated mongodb-core to 2.1.8.
* NODE-925 Ensure we reschedule operations while the pool is still growing (below poolSize) and there are no connections not currently performing work.
* NODE-927 fixes issue where authentication was performed against arbiter instances.
* NODE-915 Normalize all host names to avoid comparison issues.
* Fixed issue where pool.destroy would never finish due to a single operation not being executed and keeping it open.
* NODE-931 Validates all the options for MongoClient.connect and fixes missing connection settings.
* NODE-929 Update SSL tutorial to correctly reflect that server/mongos/replset subobjects are no longer needed.
* Fix sensitive command check (Issue #1473, https://github.com/Annoraaq)
2.2.22 2017-01-24
-----------------
* Updated mongodb-core to 2.1.7.
* NODE-919 ReplicaSet connection does not close immediately (Issue #156).
* NODE-901 Fixed bug when normalizing host names.
* NODE-909 Fixed readPreference issue caused by direct connection to primary.
* NODE-910 Fixed issue when bufferMaxEntries == 0 and read preference set to nearest.
* Add missing unref implementations for replset, mongos (Issue #1455, https://github.com/zbjornson)
2.2.21 2017-01-13
-----------------
* Updated mongodb-core to 2.1.6.
* NODE-908 Keep auth contexts in replset and mongos topology to ensure correct application of authentication credentials when primary is first server to be detected causing an immediate connect event to happen.
2.2.20 2017-01-11
-----------------
* Updated mongodb-core to 2.1.5 to include bson 1.0.4 and bson-ext 1.0.4 due to Buffer.from being broken in early node 4.x versions.
2.2.19 2017-01-03
-----------------
* Corrupted Npm release fix.
2.2.18 2017-01-03
-----------------
* Updated mongodb-core to 2.1.4 to fix bson ObjectId toString issue with utils.inspect messing with toString parameters in node 6.
2.2.17 2017-01-02
-----------------
* updated createCollection doc options and linked to create command.
* Updated mongodb-core to 2.1.3.
* Monitoring operations are re-scheduled in the pool if it cannot find a connection that does not already have scheduled work on it; this avoids the monitoring socket timeout being applied to existing operations on the socket due to pipelining.
* Moved replicaset monitoring away from serial mode and to parallel mode.
* updated bson and bson-ext dependencies to 1.0.2.
2.2.16 2016-12-13
-----------------
* NODE-899 reversed upsertedId change to bring back old behavior.
2.2.15 2016-12-10
-----------------
* Updated mongodb-core to 2.1.2.
* Delay topologyMonitoring on successful attemptReconnect as no need to run a full scan immediately.
* Emit reconnect event in primary joining when in connected status for a replicaset (Fixes mongoose reconnect issue).
2.2.14 2016-12-08
-----------------
* Updated mongodb-core to 2.1.1.
* NODE-892 Passthrough options.readPreference to mongodb-core ReplSet instance.
2.2.13 2016-12-05
-----------------
* Updated mongodb-core to 2.1.0.
* NODE-889 Fixed issue where legacy killcursor wire protocol messages would not be sent when APM is enabled.
* Expose parserType as property on topology objects.
2.2.12 2016-11-29
-----------------
* Updated mongodb-core to 2.0.14.
* Updated bson library to 0.5.7.
@ -21,7 +231,7 @@
* NODE-864 close event not emits during network issues using single server topology.
* Introduced maxStalenessSeconds.
* NODE-840 Added CRUD specification test cases and fix minor issues with upserts reporting matchedCount > 0.
* Don't ignore Db-level authSource when using auth method. (https://github.com/donaldguy).
* Don't ignore Db-level authSource when using auth method. (https://github.com/donaldguy).
2.2.11 2016-10-21
-----------------

View file

@ -2,7 +2,7 @@
[![Build Status](https://secure.travis-ci.org/mongodb/node-mongodb-native.svg?branch=2.1)](http://travis-ci.org/mongodb/node-mongodb-native)
[![Coverage Status](https://coveralls.io/repos/github/mongodb/node-mongodb-native/badge.svg?branch=2.1)](https://coveralls.io/github/mongodb/node-mongodb-native?branch=2.1)
[![Gitter](https://badges.gitter.im/Join Chat.svg)](https://gitter.im/mongodb/node-mongodb-native?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge)
[![Gitter](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/mongodb/node-mongodb-native?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge)
# Description
@ -325,7 +325,7 @@ Next lets delete the document where the field **a** equals to **3**.
var deleteDocument = function(db, callback) {
// Get the documents collection
var collection = db.collection('documents');
// Insert some documents
// Delete document where a is 3
collection.deleteOne({ a : 3 }, function(err, result) {
assert.equal(err, null);
assert.equal(1, result.result.n);

View file

@ -36,7 +36,10 @@
"node_modules/bson/lib/bson/symbol.js",
"node_modules/bson/lib/bson/timestamp.js",
"node_modules/bson/lib/bson/max_key.js",
"node_modules/bson/lib/bson/min_key.js"
"node_modules/bson/lib/bson/min_key.js",
"node_modules/bson/lib/bson/decimal128.js",
"node_modules/bson/lib/bson/int_32.js",
"node_modules/bson/lib/bson/regexp.js"
]
},
"templates": {
@ -46,7 +49,6 @@
"outputSourceFiles" : true
},
"applicationName": "Node.js MongoDB Driver API",
"disqus": true,
"googleAnalytics": "UA-29229787-1",
"openGraph": {
"title": "",

1
node/node_modules/mongodb/index.js generated vendored
View file

@ -40,6 +40,7 @@ connect.ObjectID = core.BSON.ObjectID;
connect.ObjectId = core.BSON.ObjectID;
connect.Symbol = core.BSON.Symbol;
connect.Timestamp = core.BSON.Timestamp;
connect.BSONRegExp = core.BSON.BSONRegExp;
connect.Decimal128 = core.BSON.Decimal128;
// Add connect method

View file

@ -2,7 +2,9 @@
var toError = require('./utils').toError,
Define = require('./metadata'),
shallowClone = require('./utils').shallowClone;
shallowClone = require('./utils').shallowClone,
assign = require('./utils').assign,
authenticate = require('./authenticate');
/**
* @fileOverview The **Admin** class is an internal class that allows convenient access to
@ -239,23 +241,23 @@ define.classMethod('ping', {callback: true, promise:true});
* @param {string} [password] The password.
* @param {Admin~resultCallback} [callback] The command result callback
* @return {Promise} returns Promise if no callback passed
* @deprecated This method will no longer be available in the next major release 3.x as MongoDB 3.6 will only allow auth against users in the admin db and will no longer allow multiple credentials on a socket. Please authenticate using MongoClient.connect with auth credentials.
*/
Admin.prototype.authenticate = function(username, password, options, callback) {
var self = this;
if(typeof options == 'function') callback = options, options = {};
options = shallowClone(options);
options.authdb = 'admin';
console.warn("Admin.prototype.authenticate method will no longer be available in the next major release 3.x as MongoDB 3.6 will only allow auth against users in the admin db and will no longer allow multiple credentials on a socket. Please authenticate using MongoClient.connect with auth credentials.");
var finalArguments = [this.s.db];
if(typeof username == 'string') finalArguments.push(username);
if(typeof password == 'string') finalArguments.push(password);
if(typeof options == 'function') {
finalArguments.push({ authdb: 'admin' });
finalArguments.push(options);
} else {
finalArguments.push(assign({}, options, { authdb: 'admin' }));
}
// Execute using callback
if(typeof callback == 'function') return this.s.db.authenticate(username, password, options, callback);
// Return a Promise
return new this.s.promiseLibrary(function(resolve, reject) {
self.s.db.authenticate(username, password, options, function(err, r) {
if(err) return reject(err);
resolve(r);
});
});
if(typeof callback == 'function') finalArguments.push(callback);
// Execute authenticate method
return authenticate.apply(this.s.db, finalArguments);
}
define.classMethod('authenticate', {callback: true, promise:true});
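Per the deprecation warning added above, credentials are expected to move to `MongoClient.connect`. A usage sketch for the 2.x driver (host, database and credentials are placeholders):

```js
var MongoClient = require('mongodb').MongoClient;

// Authenticate at connect time instead of calling Admin.prototype.authenticate
MongoClient.connect('mongodb://user:secret@localhost:27017/mydb?authSource=admin', function(err, db) {
  if (err) throw err;
  // db is already authenticated against the admin database
  db.close();
});
```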
@ -433,10 +435,11 @@ var setProfilingLevel = function(self, level, callback) {
define.classMethod('setProfilingLevel', {callback: true, promise:true});
/**
* Retrive the current profiling information for MongoDB
* Retrieve the current profiling information for MongoDB
*
* @param {Admin~resultCallback} [callback] The command result callback.
* @return {Promise} returns Promise if no callback passed
* @deprecated Query the system.profile collection directly.
*/
Admin.prototype.profilingInfo = function(callback) {
var self = this;

View file

@ -82,7 +82,7 @@ var AggregationCursor = function(bson, ns, cmd, options, topology, topologyOptio
, streamOptions: streamOptions
// BSON
, bson: bson
// Namespae
// Namespace
, ns: ns
// Command
, cmd: cmd
@ -144,7 +144,7 @@ var define = AggregationCursor.define = new Define('AggregationCursor', Aggregat
*/
AggregationCursor.prototype.batchSize = function(value) {
if(this.s.state == AggregationCursor.CLOSED || this.isDead()) throw MongoError.create({message: "Cursor is closed", driver:true });
if(typeof value != 'number') throw MongoError.create({message: "batchSize requires an integer", drvier:true });
if(typeof value != 'number') throw MongoError.create({message: "batchSize requires an integer", driver:true });
if(this.s.cmd.cursor) this.s.cmd.cursor.batchSize = value;
this.setCursorBatchSize(value);
return this;
@ -316,6 +316,7 @@ AggregationCursor.prototype.get = AggregationCursor.prototype.toArray;
define.classMethod('toArray', {callback: true, promise:true});
define.classMethod('each', {callback: true, promise:false});
define.classMethod('forEach', {callback: true, promise:false});
define.classMethod('hasNext', {callback: true, promise:true});
define.classMethod('next', {callback: true, promise:true});
define.classMethod('close', {callback: true, promise:true});
define.classMethod('isClosed', {callback: false, promise:false, returns: [Boolean]});
@ -331,6 +332,14 @@ define.classMethod('readBufferedDocuments', {callback: false, promise:false, ret
* @return {Promise} returns Promise if no callback passed
*/
/**
* Check if there is any document still available in the cursor
* @function AggregationCursor.prototype.hasNext
* @param {AggregationCursor~resultCallback} [callback] The result callback.
* @throws {MongoError}
* @return {Promise} returns Promise if no callback passed
*/
/**
* The callback format for results
* @callback AggregationCursor~toArrayResultCallback
@ -341,7 +350,7 @@ define.classMethod('readBufferedDocuments', {callback: false, promise:false, ret
/**
* Returns an array of documents. The caller is responsible for making sure that there
* is enough memory to store the results. Note that the array only contain partial
* results when this cursor had been previouly accessed. In that case,
* results when this cursor had been previously accessed. In that case,
* cursor.rewind() can be used to reset the cursor.
* @method AggregationCursor.prototype.toArray
* @param {AggregationCursor~toArrayResultCallback} [callback] The result callback.
@ -358,7 +367,7 @@ define.classMethod('readBufferedDocuments', {callback: false, promise:false, ret
/**
* Iterates over all the documents for this cursor. As with **{cursor.toArray}**,
* not all of the elements will be iterated if this cursor had been previouly accessed.
* not all of the elements will be iterated if this cursor had been previously accessed.
* In that case, **{cursor.rewind}** can be used to reset the cursor. However, unlike
* **{cursor.toArray}**, the cursor will only hold a maximum of batch size elements
* at any given time if batch size is specified. Otherwise, the caller is responsible

20
node/node_modules/mongodb/lib/apm.js generated vendored
View file

@ -96,7 +96,7 @@ var Instrumentation = function(core, options, callback) {
// The actual prototype
proto[x] = function() {
var requestId = core.Query.nextRequestId();
// Get the aruments
// Get the arguments
var args = Array.prototype.slice.call(arguments, 0);
var ns = args[0];
var commandObj = args[1];
@ -181,7 +181,7 @@ var Instrumentation = function(core, options, callback) {
};
// Filter out any sensitive commands
if(senstiveCommands.indexOf(commandName.toLowerCase())) {
if(senstiveCommands.indexOf(commandName.toLowerCase()) != -1) {
command.commandObj = {};
command.commandObj[commandName] = true;
}
@ -208,7 +208,7 @@ var Instrumentation = function(core, options, callback) {
command.failure = err || r.result.writeErrors || r.result;
// Filter out any sensitive commands
if(senstiveCommands.indexOf(commandName.toLowerCase())) {
if(senstiveCommands.indexOf(commandName.toLowerCase()) != -1) {
command.failure = {};
}
@ -261,7 +261,7 @@ var Instrumentation = function(core, options, callback) {
// The actual prototype
proto[x] = function() {
// Get the aruments
// Get the arguments
var args = Array.prototype.slice.call(arguments, 0);
// Set an operation Id on the bulk object
this.operationId = operationIdGenerator.next();
@ -342,7 +342,7 @@ var Instrumentation = function(core, options, callback) {
}
if(cmd.maxTimeMS) command.maxTimeMS = cmd.maxTimeMS;
} else if(x == '_killcursors') {
} else if(x == '_killcursor') {
command = {
killCursors: collection,
cursors: [this.cursorState.cursorId]
@ -435,7 +435,7 @@ var Instrumentation = function(core, options, callback) {
connectionId: connectionId
};
// Get the aruments
// Get the arguments
var args = Array.prototype.slice.call(arguments, 0);
// Get the callback
@ -461,6 +461,10 @@ var Instrumentation = function(core, options, callback) {
reply: [{ok:1}]
};
// Apply callback to the list of args
args.push(callback);
// Apply the call
func.apply(this, args);
// Emit the command
return self.emit('succeeded', command)
}
@ -489,7 +493,9 @@ var Instrumentation = function(core, options, callback) {
nextBatch: cursor.cursorState.documents
}, ok:1
}
} else if(commandName.toLowerCase() == 'find' && r == null) {
} else if((commandName.toLowerCase() == 'find'
|| commandName.toLowerCase() == 'aggregate'
|| commandName.toLowerCase() == 'listcollections') && r == null) {
r = {
cursor: {
id: cursor.cursorState.cursorId,

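The two `senstiveCommands.indexOf(...)` changes above fix a classic truthiness bug: `indexOf` returns a position, so a match at index 0 is falsy while a miss (-1) is truthy. A generic illustration (the array contents here are illustrative, not the driver's actual list):

```js
var sensitive = ['authenticate', 'saslstart', 'saslcontinue'];

// Old-style truthiness check: wrong at both ends
Boolean(sensitive.indexOf('authenticate')); // false — found at index 0, but 0 is falsy
Boolean(sensitive.indexOf('ping'));         // true  — not found, but -1 is truthy

// Fixed check: an explicit membership test
sensitive.indexOf('authenticate') !== -1;   // true
sensitive.indexOf('ping') !== -1;           // false
```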
110
node/node_modules/mongodb/lib/authenticate.js generated vendored Normal file
View file

@ -0,0 +1,110 @@
var shallowClone = require('./utils').shallowClone
, handleCallback = require('./utils').handleCallback
, MongoError = require('mongodb-core').MongoError
, f = require('util').format;
var authenticate = function(self, username, password, options, callback) {
// Did the user destroy the topology
if(self.serverConfig && self.serverConfig.isDestroyed()) return callback(new MongoError('topology was destroyed'));
// the default db to authenticate against is 'self'
// if authenticate is called from a retry context, it may be another one, like admin
var authdb = options.dbName ? options.dbName : self.databaseName;
authdb = self.authSource ? self.authSource : authdb;
authdb = options.authdb ? options.authdb : authdb;
authdb = options.authSource ? options.authSource : authdb;
// Callback
var _callback = function(err, result) {
if(self.listeners('authenticated').length > 0) {
self.emit('authenticated', err, result);
}
// Return to caller
handleCallback(callback, err, result);
}
// authMechanism
var authMechanism = options.authMechanism || '';
authMechanism = authMechanism.toUpperCase();
// If classic auth delegate to auth command
if(authMechanism == 'MONGODB-CR') {
self.s.topology.auth('mongocr', authdb, username, password, function(err) {
if(err) return handleCallback(callback, err, false);
_callback(null, true);
});
} else if(authMechanism == 'PLAIN') {
self.s.topology.auth('plain', authdb, username, password, function(err) {
if(err) return handleCallback(callback, err, false);
_callback(null, true);
});
} else if(authMechanism == 'MONGODB-X509') {
self.s.topology.auth('x509', authdb, username, password, function(err) {
if(err) return handleCallback(callback, err, false);
_callback(null, true);
});
} else if(authMechanism == 'SCRAM-SHA-1') {
self.s.topology.auth('scram-sha-1', authdb, username, password, function(err) {
if(err) return handleCallback(callback, err, false);
_callback(null, true);
});
} else if(authMechanism == 'GSSAPI') {
if(process.platform == 'win32') {
self.s.topology.auth('sspi', authdb, username, password, options, function(err) {
if(err) return handleCallback(callback, err, false);
_callback(null, true);
});
} else {
self.s.topology.auth('gssapi', authdb, username, password, options, function(err) {
if(err) return handleCallback(callback, err, false);
_callback(null, true);
});
}
} else if(authMechanism == 'DEFAULT') {
self.s.topology.auth('default', authdb, username, password, function(err) {
if(err) return handleCallback(callback, err, false);
_callback(null, true);
});
} else {
handleCallback(callback, MongoError.create({message: f("authentication mechanism %s not supported", options.authMechanism), driver:true}));
}
}
module.exports = function(self, username, password, options, callback) {
if(typeof options == 'function') callback = options, options = {};
// Shallow copy the options
options = shallowClone(options);
// Set default mechanism
if(!options.authMechanism) {
options.authMechanism = 'DEFAULT';
} else if(options.authMechanism != 'GSSAPI'
&& options.authMechanism != 'DEFAULT'
&& options.authMechanism != 'MONGODB-CR'
&& options.authMechanism != 'MONGODB-X509'
&& options.authMechanism != 'SCRAM-SHA-1'
&& options.authMechanism != 'PLAIN') {
return handleCallback(callback, MongoError.create({message: "only DEFAULT, GSSAPI, PLAIN, MONGODB-X509, SCRAM-SHA-1 or MONGODB-CR is supported by authMechanism", driver:true}));
}
// If we have a callback fallback
if(typeof callback == 'function') return authenticate(self, username, password, options, function(err, r) {
// Support failed auth method
if(err && err.message && err.message.indexOf('saslStart') != -1) err.code = 59;
// Reject error
if(err) return callback(err, r);
callback(null, r);
});
// Return a promise
return new self.s.promiseLibrary(function(resolve, reject) {
authenticate(self, username, password, options, function(err, r) {
// Support failed auth method
if(err && err.message && err.message.indexOf('saslStart') != -1) err.code = 59;
// Reject error
if(err) return reject(err);
resolve(r);
});
});
};

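A minimal usage sketch for the new helper (it is internal and normally reached through `Db.prototype.authenticate`; `db` is assumed to be a connected `Db` instance):

```js
var authenticate = require('./authenticate');

// Callback form: the helper resolves the auth database from options / authSource
authenticate(db, 'user', 'secret', { authMechanism: 'SCRAM-SHA-1' }, function(err, ok) {
  if (err) return console.error('authentication failed', err);
  console.log('authenticated:', ok); // true on success
});

// Without a callback it returns a promise built from db.s.promiseLibrary
authenticate(db, 'user', 'secret', {}).then(function(ok) {
  console.log('authenticated:', ok);
});
```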
View file

@ -151,6 +151,7 @@ var BulkWriteResult = function(bulkResult) {
/**
* Returns a specific write error object
*
* @param {number} index Index of the write error to return; returns null if there is no result for the passed-in index
* @return {WriteError}
*/
this.getWriteErrorAt = function(index) {
@ -312,7 +313,7 @@ var mergeBatchResults = function(ordered, batch, bulkResult, err, result) {
var lastOpT = null;
// We have a time stamp
if(opTime instanceof Timestamp) {
if(opTime && opTime._bsontype == 'Timestamp') {
if(bulkResult.lastOp == null) {
bulkResult.lastOp = opTime;
} else if(opTime.greaterThan(bulkResult.lastOp)) {

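The `opTime instanceof Timestamp` check above becomes a `_bsontype` tag comparison: `instanceof` breaks when more than one copy of the bson package is loaded (for example via bson-ext or a nested node_modules), while the tag survives. A quick illustration:

```js
var Timestamp = require('bson').Timestamp;

var opTime = new Timestamp(0, 12345); // lowBits, highBits

// Works even if opTime was produced by a different copy of the bson module
console.log(opTime._bsontype === 'Timestamp'); // true
console.log(opTime instanceof Timestamp);      // true only for this copy of bson
```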
View file

@ -12,7 +12,9 @@ var common = require('./common')
, Batch = common.Batch
, mergeBatchResults = common.mergeBatchResults;
var bson = new BSON.BSONPure();
var bson = new BSON([BSON.Binary, BSON.Code, BSON.DBRef, BSON.Decimal128,
BSON.Double, BSON.Int32, BSON.Long, BSON.Map, BSON.MaxKey, BSON.MinKey,
BSON.ObjectId, BSON.BSONRegExp, BSON.Symbol, BSON.Timestamp]);
/**
* Create a FindOperatorsOrdered instance (INTERNAL TYPE, do not instantiate directly)
@ -148,7 +150,9 @@ FindOperatorsOrdered.prototype.remove = FindOperatorsOrdered.prototype.delete;
// Add to internal list of documents
var addToOperationsList = function(_self, docType, document) {
// Get the bsonSize
var bsonSize = bson.calculateObjectSize(document, false);
var bsonSize = bson.calculateObjectSize(document, {
checkKeys: false,
});
// Throw error if the doc is bigger than the max BSON size
if(bsonSize >= _self.s.maxBatchSizeBytes) {
@ -484,30 +488,37 @@ var executeCommands = function(self, callback) {
*/
OrderedBulkOperation.prototype.execute = function(_writeConcern, callback) {
var self = this;
if(this.s.executed) throw new toError("batch cannot be re-executed");
if(typeof _writeConcern == 'function') {
if (this.s.executed) {
var executedError = toError('batch cannot be re-executed');
return (typeof callback === 'function') ?
callback(executedError, null) : this.s.promiseLibrary.reject(executedError);
}
if (typeof _writeConcern === 'function') {
callback = _writeConcern;
} else if(_writeConcern && typeof _writeConcern == 'object') {
} else if (_writeConcern && typeof _writeConcern === 'object') {
this.s.writeConcern = _writeConcern;
}
// If we have current batch
if(this.s.currentBatch) this.s.batches.push(this.s.currentBatch)
if (this.s.currentBatch) this.s.batches.push(this.s.currentBatch)
// If we have no operations in the bulk raise an error
if(this.s.batches.length == 0) {
throw toError("Invalid Operation, No operations in bulk");
if (this.s.batches.length == 0) {
var emptyBatchError = toError('Invalid Operation, no operations specified');
return (typeof callback === 'function') ?
callback(emptyBatchError, null) : this.s.promiseLibrary.reject(emptyBatchError);
}
// Execute using callback
if(typeof callback == 'function') {
return executeCommands(this, callback);
}
if (typeof callback === 'function') {
return executeCommands(this, callback);
}
// Return a Promise
return new this.s.promiseLibrary(function(resolve, reject) {
executeCommands(self, function(err, r) {
if(err) return reject(err);
if (err) return reject(err);
resolve(r);
});
});

View file

@ -12,7 +12,9 @@ var common = require('./common')
, Batch = common.Batch
, mergeBatchResults = common.mergeBatchResults;
var bson = new BSON.BSONPure();
var bson = new BSON([BSON.Binary, BSON.Code, BSON.DBRef, BSON.Decimal128,
BSON.Double, BSON.Int32, BSON.Long, BSON.Map, BSON.MaxKey, BSON.MinKey,
BSON.ObjectId, BSON.BSONRegExp, BSON.Symbol, BSON.Timestamp]);
/**
* Create a FindOperatorsUnordered instance (INTERNAL TYPE, do not instantiate directly)
@ -93,7 +95,7 @@ FindOperatorsUnordered.prototype.replaceOne = function(updateDocument) {
*
* @method
* @throws {MongoError}
* @return {UnorderedBulkOperation}
* @return {FindOperatorsUnordered}
*/
FindOperatorsUnordered.prototype.upsert = function() {
this.s.currentOp.upsert = true;
@ -145,7 +147,9 @@ FindOperatorsUnordered.prototype.remove = function() {
//
var addToOperationsList = function(_self, docType, document) {
// Get the bsonSize
var bsonSize = bson.calculateObjectSize(document, false);
var bsonSize = bson.calculateObjectSize(document, {
checkKeys: false,
});
// Throw error if the doc is bigger than the max BSON size
if(bsonSize >= _self.s.maxBatchSizeBytes) throw toError("document is larger than the maximum size " + _self.s.maxBatchSizeBytes);
// Holds the current batch
@ -185,7 +189,7 @@ var addToOperationsList = function(_self, docType, document) {
// Save back the current Batch to the right type
if(docType == common.INSERT) {
_self.s.currentInsertBatch = _self.s.currentBatch;
_self.s.bulkResult.insertedIds.push({index: _self.s.currentIndex, _id: document._id});
_self.s.bulkResult.insertedIds.push({index: _self.s.bulkResult.insertedIds.length, _id: document._id});
} else if(docType == common.UPDATE) {
_self.s.currentUpdateBatch = _self.s.currentBatch;
} else if(docType == common.REMOVE) {
@ -209,7 +213,7 @@ var addToOperationsList = function(_self, docType, document) {
var UnorderedBulkOperation = function(topology, collection, options) {
options = options == null ? {} : options;
// Get the namesspace for the write operations
// Get the namespace for the write operations
var namespace = collection.collectionName;
// Used to mark operation as executed
var executed = false;
@ -486,30 +490,37 @@ var executeBatches = function(self, callback) {
*/
UnorderedBulkOperation.prototype.execute = function(_writeConcern, callback) {
var self = this;
if(this.s.executed) throw toError("batch cannot be re-executed");
if(typeof _writeConcern == 'function') {
if (this.s.executed) {
var executedError = toError('batch cannot be re-executed');
return (typeof callback === 'function') ?
callback(executedError, null) : this.s.promiseLibrary.reject(executedError);
}
if (typeof _writeConcern === 'function') {
callback = _writeConcern;
} else if(_writeConcern && typeof _writeConcern == 'object') {
} else if (_writeConcern && typeof _writeConcern === 'object') {
this.s.writeConcern = _writeConcern;
}
// If we have current batch
if(this.s.currentInsertBatch) this.s.batches.push(this.s.currentInsertBatch);
if(this.s.currentUpdateBatch) this.s.batches.push(this.s.currentUpdateBatch);
if(this.s.currentRemoveBatch) this.s.batches.push(this.s.currentRemoveBatch);
if (this.s.currentInsertBatch) this.s.batches.push(this.s.currentInsertBatch);
if (this.s.currentUpdateBatch) this.s.batches.push(this.s.currentUpdateBatch);
if (this.s.currentRemoveBatch) this.s.batches.push(this.s.currentRemoveBatch);
// If we have no operations in the bulk raise an error
if(this.s.batches.length == 0) {
throw toError("Invalid Operation, No operations in bulk");
if (this.s.batches.length == 0) {
var emptyBatchError = toError('Invalid Operation, no operations specified');
return (typeof callback === 'function') ?
callback(emptyBatchError, null) : this.s.promiseLibrary.reject(emptyBatchError);
}
// Execute using callback
if(typeof callback == 'function') return executeBatches(this, callback);
if (typeof callback === 'function') return executeBatches(this, callback);
// Return a Promise
return new this.s.promiseLibrary(function(resolve, reject) {
executeBatches(self, function(err, r) {
if(err) return reject(err);
if (err) return reject(err);
resolve(r);
});
});

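With the change above, `execute()` on an empty or already-executed bulk reports the error through the callback or the returned promise instead of throwing synchronously. A behavior sketch (assumes an open `collection` obtained from `MongoClient.connect`):

```js
var bulk = collection.initializeUnorderedBulkOp();

// No operations were queued, so this no longer throws:
bulk.execute(function(err, result) {
  // err.message === 'Invalid Operation, no operations specified'
});

// The promise form surfaces the same error as a rejection:
// collection.initializeOrderedBulkOp().execute().catch(function(err) { /* ... */ });
```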
View file

@ -46,6 +46,8 @@ var checkCollectionName = require('./utils').checkCollectionName
* });
*/
var mergeKeys = ['readPreference', 'ignoreUndefined'];
/**
* Create a new Collection instance (INTERNAL TYPE, do not instantiate directly)
* @class
@ -247,7 +249,7 @@ Collection.prototype.find = function() {
}
// Check special case where we are using an objectId
if(selector instanceof ObjectID || (selector != null && selector._bsontype == 'ObjectID')) {
if(selector != null && selector._bsontype == 'ObjectID') {
selector = {_id:selector};
}
@ -274,6 +276,14 @@ Collection.prototype.find = function() {
if (!options) options = {};
var newOptions = {};
// Make a shallow copy of the collection options
for(var key in this.s.options) {
if(mergeKeys.indexOf(key) != -1) {
newOptions[key] = this.s.options[key];
}
}
// Make a shallow copy of options
for (var key in options) {
newOptions[key] = options[key];
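The new loop above copies `readPreference` and `ignoreUndefined` from the collection's own options into the cursor options, so collection-level settings now apply to `find()`. A usage sketch (assumes an open `db` from `MongoClient.connect`):

```js
// Collection-level options now flow into find() unless overridden per call
var coll = db.collection('documents', {
  readPreference: 'secondaryPreferred',
  ignoreUndefined: true
});

coll.find({ a: 1 }).toArray(function(err, docs) {
  if (err) throw err;
  // the underlying cursor inherited readPreference and ignoreUndefined
  console.log(docs.length);
});
```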
@ -406,7 +416,7 @@ Collection.prototype.insertOne = function(doc, options, callback) {
});
}
// Add ignoreUndfined
// Add ignoreUndefined
if(this.s.options.ignoreUndefined) {
options = shallowClone(options);
options.ignoreUndefined = this.s.options.ignoreUndefined;
@ -437,7 +447,7 @@ var insertOne = function(self, doc, options, callback) {
});
}
var mapInserManyResults = function(docs, r) {
var mapInsertManyResults = function(docs, r) {
var ids = r.getInsertedIds();
var keys = Object.keys(ids);
var finalIds = new Array(keys.length);
@ -478,13 +488,14 @@ define.classMethod('insertOne', {callback: true, promise:true});
* @param {boolean} [options.serializeFunctions=false] Serialize functions on any object.
* @param {boolean} [options.forceServerObjectId=false] Force server to assign _id values instead of driver.
* @param {boolean} [options.bypassDocumentValidation=false] Allow driver to bypass schema validation in MongoDB 3.2 or higher.
* @param {boolean} [options.ordered=true] If true, when an insert fails, don't execute the remaining writes. If false, continue with remaining inserts when one fails.
* @param {Collection~insertWriteOpCallback} [callback] The command result callback
* @return {Promise} returns Promise if no callback passed
*/
Collection.prototype.insertMany = function(docs, options, callback) {
var self = this;
if(typeof options == 'function') callback = options, options = {};
options = options || {ordered:true};
options = options ? shallowClone(options) : {ordered:true};
if(!Array.isArray(docs) && typeof callback == 'function') {
return callback(MongoError.create({message: 'docs parameter must be an array of documents', driver:true }));
} else if(!Array.isArray(docs)) {
@ -521,14 +532,14 @@ Collection.prototype.insertMany = function(docs, options, callback) {
// Execute using callback
if(typeof callback == 'function') return bulkWrite(self, operations, options, function(err, r) {
if(err) return callback(err, r);
callback(null, mapInserManyResults(docs, r));
callback(null, mapInsertManyResults(docs, r));
});
// Return a Promise
return new this.s.promiseLibrary(function(resolve, reject) {
bulkWrite(self, operations, options, function(err, r) {
if(err) return reject(err);
resolve(mapInserManyResults(docs, r));
resolve(mapInsertManyResults(docs, r));
});
});
}
@ -609,7 +620,7 @@ Collection.prototype.bulkWrite = function(operations, options, callback) {
}
var bulkWrite = function(self, operations, options, callback) {
// Add ignoreUndfined
// Add ignoreUndefined
if(self.s.options.ignoreUndefined) {
options = shallowClone(options);
options.ignoreUndefined = self.s.options.ignoreUndefined;
@ -724,7 +735,7 @@ var insertDocuments = function(self, docs, options, callback) {
// Add _id if not specified
if(forceServerObjectId !== true){
for(var i = 0; i < docs.length; i++) {
if(docs[i]._id == null) docs[i]._id = self.s.pkFactory.createPk();
if(docs[i]._id === void 0) docs[i]._id = self.s.pkFactory.createPk();
}
}
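The `void 0` comparison above distinguishes a missing `_id` from an explicit `null`: only a truly absent `_id` gets a generated primary key. A behavior sketch (assumes an open `collection`):

```js
// _id absent: the pkFactory (ObjectId by default) fills it in
collection.insertOne({ name: 'a' }, function(err, r) {
  // r.ops[0]._id is a generated ObjectId
});

// _id explicitly null: preserved instead of being replaced
collection.insertOne({ _id: null, name: 'b' }, function(err, r) {
  // r.ops[0]._id === null
});
```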
@ -866,7 +877,7 @@ Collection.prototype.updateOne = function(filter, update, options, callback) {
if(typeof options == 'function') callback = options, options = {};
options = shallowClone(options)
// Add ignoreUndfined
// Add ignoreUndefined
if(this.s.options.ignoreUndefined) {
options = shallowClone(options);
options.ignoreUndefined = this.s.options.ignoreUndefined;
@ -893,7 +904,7 @@ var updateOne = function(self, filter, update, options, callback) {
if(err && callback) return callback(err);
if(r == null) return callback(null, {result: {ok:1}});
r.modifiedCount = r.result.nModified != null ? r.result.nModified : r.result.n;
r.upsertedId = Array.isArray(r.result.upserted) && r.result.upserted.length > 0 ? r.result.upserted[0]._id : null;
r.upsertedId = Array.isArray(r.result.upserted) && r.result.upserted.length > 0 ? r.result.upserted[0] : null;
r.upsertedCount = Array.isArray(r.result.upserted) && r.result.upserted.length ? r.result.upserted.length : 0;
r.matchedCount = Array.isArray(r.result.upserted) && r.result.upserted.length > 0 ? 0 : r.result.n;
if(callback) callback(null, r);
@ -921,7 +932,7 @@ Collection.prototype.replaceOne = function(filter, doc, options, callback) {
if(typeof options == 'function') callback = options, options = {};
options = shallowClone(options)
// Add ignoreUndfined
// Add ignoreUndefined
if(this.s.options.ignoreUndefined) {
options = shallowClone(options);
options.ignoreUndefined = this.s.options.ignoreUndefined;
@ -948,8 +959,9 @@ var replaceOne = function(self, filter, doc, options, callback) {
if(callback == null) return;
if(err && callback) return callback(err);
if(r == null) return callback(null, {result: {ok:1}});
r.modifiedCount = r.result.nModified != null ? r.result.nModified : r.result.n;
r.upsertedId = Array.isArray(r.result.upserted) && r.result.upserted.length > 0 ? r.result.upserted[0]._id : null;
r.upsertedId = Array.isArray(r.result.upserted) && r.result.upserted.length > 0 ? r.result.upserted[0] : null;
r.upsertedCount = Array.isArray(r.result.upserted) && r.result.upserted.length ? r.result.upserted.length : 0;
r.matchedCount = Array.isArray(r.result.upserted) && r.result.upserted.length > 0 ? 0 : r.result.n;
r.ops = [doc];
@ -977,7 +989,7 @@ Collection.prototype.updateMany = function(filter, update, options, callback) {
if(typeof options == 'function') callback = options, options = {};
options = shallowClone(options)
// Add ignoreUndfined
// Add ignoreUndefined
if(this.s.options.ignoreUndefined) {
options = shallowClone(options);
options.ignoreUndefined = this.s.options.ignoreUndefined;
@ -1004,7 +1016,7 @@ var updateMany = function(self, filter, update, options, callback) {
if(err && callback) return callback(err);
if(r == null) return callback(null, {result: {ok:1}});
r.modifiedCount = r.result.nModified != null ? r.result.nModified : r.result.n;
r.upsertedId = Array.isArray(r.result.upserted) && r.result.upserted.length > 0 ? r.result.upserted[0]._id : null;
r.upsertedId = Array.isArray(r.result.upserted) && r.result.upserted.length > 0 ? r.result.upserted[0] : null;
r.upsertedCount = Array.isArray(r.result.upserted) && r.result.upserted.length ? r.result.upserted.length : 0;
r.matchedCount = Array.isArray(r.result.upserted) && r.result.upserted.length > 0 ? 0 : r.result.n;
if(callback) callback(null, r);
@ -1032,8 +1044,8 @@ var updateDocuments = function(self, selector, document, options, callback) {
// Execute the operation
var op = {q: selector, u: document};
op.upsert = typeof options.upsert == 'boolean' ? options.upsert : false;
op.multi = typeof options.multi == 'boolean' ? options.multi : false;
op.upsert = options.upsert !== void 0 ? !!options.upsert : false;
op.multi = options.multi !== void 0 ? !!options.multi : false;
// Have we specified collation
decorateWithCollation(finalOptions, self, options);
@ -1071,7 +1083,7 @@ var updateDocuments = function(self, selector, document, options, callback) {
Collection.prototype.update = function(selector, document, options, callback) {
var self = this;
// Add ignoreUndfined
// Add ignoreUndefined
if(this.s.options.ignoreUndefined) {
options = shallowClone(options);
options.ignoreUndefined = this.s.options.ignoreUndefined;
@ -1123,7 +1135,7 @@ Collection.prototype.deleteOne = function(filter, options, callback) {
if(typeof options == 'function') callback = options, options = {};
options = shallowClone(options);
// Add ignoreUndfined
// Add ignoreUndefined
if(this.s.options.ignoreUndefined) {
options = shallowClone(options);
options.ignoreUndefined = this.s.options.ignoreUndefined;
@ -1174,7 +1186,7 @@ Collection.prototype.deleteMany = function(filter, options, callback) {
if(typeof options == 'function') callback = options, options = {};
options = shallowClone(options);
// Add ignoreUndfined
// Add ignoreUndefined
if(this.s.options.ignoreUndefined) {
options = shallowClone(options);
options.ignoreUndefined = this.s.options.ignoreUndefined;
@ -1263,7 +1275,7 @@ define.classMethod('removeMany', {callback: true, promise:true});
Collection.prototype.remove = function(selector, options, callback) {
var self = this;
// Add ignoreUndfined
// Add ignoreUndefined
if(this.s.options.ignoreUndefined) {
options = shallowClone(options);
options.ignoreUndefined = this.s.options.ignoreUndefined;
@ -1301,7 +1313,7 @@ Collection.prototype.save = function(doc, options, callback) {
if(typeof options == 'function') callback = options, options = {};
options = options || {};
// Add ignoreUndfined
// Add ignoreUndefined
if(this.s.options.ignoreUndefined) {
options = shallowClone(options);
options.ignoreUndefined = this.s.options.ignoreUndefined;
@ -1373,7 +1385,7 @@ define.classMethod('save', {callback: true, promise:true});
* @param {boolean} [options.promoteBuffers=false] Promotes Binary BSON values to native Node Buffers.
* @param {(ReadPreference|string)} [options.readPreference=null] The preferred read preference (ReadPreference.PRIMARY, ReadPreference.PRIMARY_PREFERRED, ReadPreference.SECONDARY, ReadPreference.SECONDARY_PREFERRED, ReadPreference.NEAREST).
* @param {boolean} [options.partial=false] Specify if the cursor should return partial results when querying against a sharded system
* @param {number} [options.maxTimeMS=null] Number of miliseconds to wait before aborting the query.
* @param {number} [options.maxTimeMS=null] Number of milliseconds to wait before aborting the query.
* @param {object} [options.collation=null] Specify collation (MongoDB 3.4 or higher) settings for update operation (see 3.4 documentation for available fields).
* @param {Collection~resultCallback} [callback] The command result callback
* @return {Promise} returns Promise if no callback passed
@ -1785,9 +1797,6 @@ var reIndex = function(self, options, callback) {
// Reindex
var cmd = {'reIndex':self.s.name};
// Decorate command with writeConcern if supported
decorateWithWriteConcern(cmd, self, options);
// Execute the command
self.s.db.command(cmd, options, function(err, result) {
if(callback == null) return;
@ -1987,7 +1996,7 @@ define.classMethod('indexInformation', {callback: true, promise:true});
* @param {boolean} [options.skip=null] The number of documents to skip for the count.
* @param {string} [options.hint=null] An index name hint for the query.
* @param {(ReadPreference|string)} [options.readPreference=null] The preferred read preference (ReadPreference.PRIMARY, ReadPreference.PRIMARY_PREFERRED, ReadPreference.SECONDARY, ReadPreference.SECONDARY_PREFERRED, ReadPreference.NEAREST).
* @param {number} [options.maxTimeMS=null] Number of miliseconds to wait before aborting the query.
* @param {number} [options.maxTimeMS=null] Number of milliseconds to wait before aborting the query.
* @param {Collection~countCallback} [callback] The command result callback
* @return {Promise} returns Promise if no callback passed
*/
@ -2030,7 +2039,7 @@ var count = function(self, query, options, callback) {
if(typeof skip == 'number') cmd.skip = skip;
if(typeof limit == 'number') cmd.limit = limit;
if(typeof maxTimeMS == 'number') cmd.maxTimeMS = maxTimeMS;
if(hint) options.hint = hint;
if(hint) cmd.hint = hint;
options = shallowClone(options);
// Ensure we have the right read preference inheritance
@ -2060,7 +2069,7 @@ define.classMethod('count', {callback: true, promise:true});
* @param {object} query The query for filtering the set of documents to which we apply the distinct filter.
* @param {object} [options=null] Optional settings.
* @param {(ReadPreference|string)} [options.readPreference=null] The preferred read preference (ReadPreference.PRIMARY, ReadPreference.PRIMARY_PREFERRED, ReadPreference.SECONDARY, ReadPreference.SECONDARY_PREFERRED, ReadPreference.NEAREST).
* @param {number} [options.maxTimeMS=null] Number of miliseconds to wait before aborting the query.
* @param {number} [options.maxTimeMS=null] Number of milliseconds to wait before aborting the query.
* @param {Collection~resultCallback} [callback] The command result callback
* @return {Promise} returns Promise if no callback passed
*/
@ -2362,8 +2371,8 @@ var findOneAndUpdate = function(self, filter, update, options, callback) {
var finalOptions = shallowClone(options);
finalOptions['fields'] = options.projection;
finalOptions['update'] = true;
finalOptions['new'] = typeof options.returnOriginal == 'boolean' ? !options.returnOriginal : false;
finalOptions['upsert'] = typeof options.upsert == 'boolean' ? options.upsert : false;
finalOptions['new'] = options.returnOriginal !== void 0 ? !options.returnOriginal : false;
finalOptions['upsert'] = options.upsert !== void 0 ? !!options.upsert : false;
// Execute findAndModify
self.findAndModify(
@ -2646,13 +2655,18 @@ Collection.prototype.aggregate = function(pipeline, options, callback) {
options = getReadPreference(this, options, this.s.db, this);
// If explain has been specified add it
if(options.explain) command.explain = options.explain;
if (options.explain) command.explain = options.explain;
// Validate that cursor options is valid
if(options.cursor != null && typeof options.cursor != 'object') {
throw toError('cursor options must be an object');
}
if (this.s.topology.capabilities().hasAggregationCursor) {
options.cursor = options.cursor || { batchSize : 1000 };
command.cursor = options.cursor;
}
// promiseLibrary
options.promiseLibrary = this.s.promiseLibrary;
@ -2663,11 +2677,6 @@ Collection.prototype.aggregate = function(pipeline, options, callback) {
throw new MongoError('cannot connect to server');
}
if(this.s.topology.capabilities().hasAggregationCursor) {
options.cursor = options.cursor || { batchSize : 1000 };
command.cursor = options.cursor;
}
// Allow disk usage command
if(typeof options.allowDiskUse == 'boolean') command.allowDiskUse = options.allowDiskUse;
if(typeof options.maxTimeMS == 'number') command.maxTimeMS = options.maxTimeMS;
@ -2676,12 +2685,18 @@ Collection.prototype.aggregate = function(pipeline, options, callback) {
return this.s.topology.cursor(this.s.namespace, command, options);
}
// We do not allow cursor
if(options.cursor) {
return this.s.topology.cursor(this.s.namespace, command, options);
if (options.cursor) {
var cursor = this.s.topology.cursor(this.s.namespace, command, options);
return cursor.toArray(function(err, result) {
if (err) {
return handleCallback(callback, err);
}
handleCallback(callback, null, result);
});
}
// Execute the command
// For legacy server versions, we execute the command and format the result
this.s.db.command(command, options, function(err, result) {
if(err) {
handleCallback(callback, err);
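
With the wiring above, passing a `cursor` option to `aggregate` hands back a cursor (with the chosen batch size) on topologies that support aggregation cursors, while legacy topologies fall back to running the command and delivering the collected results. A hedged usage sketch; pipeline and collection names are invented for illustration.

```
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://localhost:27017/test', function(err, db) {
  if(err) throw err;
  var cursor = db.collection('orders').aggregate(
    [ { $match: { status: 'shipped' } },
      { $group: { _id: '$customerId', total: { $sum: '$amount' } } } ],
    { cursor: { batchSize: 100 }, allowDiskUse: true });

  cursor.toArray(function(err, docs) {
    if(err) console.error(err);
    else console.log('groups:', docs.length);
    db.close();
  });
});
```
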
@ -2978,6 +2993,7 @@ var groupFunction = 'function () {\nvar c = db[ns].find(condition);\nvar map = n
* @param {(ReadPreference|string)} [options.readPreference=null] The preferred read preference (ReadPreference.PRIMARY, ReadPreference.PRIMARY_PREFERRED, ReadPreference.SECONDARY, ReadPreference.SECONDARY_PREFERRED, ReadPreference.NEAREST).
* @param {Collection~resultCallback} [callback] The command result callback
* @return {Promise} returns Promise if no callback passed
* @deprecated MongoDB 3.6 or higher will no longer support the group command. We recommend rewriting using the aggregation framework.
*/
Collection.prototype.group = function(keys, condition, initial, reduce, finalize, command, options, callback) {
var self = this;
@ -2996,7 +3012,7 @@ Collection.prototype.group = function(keys, condition, initial, reduce, finalize
finalize = null;
}
if (!Array.isArray(keys) && keys instanceof Object && typeof(keys) !== 'function' && !(keys instanceof Code)) {
if (!Array.isArray(keys) && keys instanceof Object && typeof(keys) !== 'function' && !(keys._bsontype == 'Code')) {
keys = Object.keys(keys);
}
@ -3025,7 +3041,7 @@ Collection.prototype.group = function(keys, condition, initial, reduce, finalize
var group = function(self, keys, condition, initial, reduce, finalize, command, options, callback) {
// Execute using the command
if(command) {
var reduceFunction = reduce instanceof Code
var reduceFunction = reduce && reduce._bsontype == 'Code'
? reduce
: new Code(reduce);
@ -3042,8 +3058,8 @@ var group = function(self, keys, condition, initial, reduce, finalize, command,
// if finalize is defined
if(finalize != null) selector.group['finalize'] = finalize;
// Set up group selector
if ('function' === typeof keys || keys instanceof Code) {
selector.group.$keyf = keys instanceof Code
if ('function' === typeof keys || (keys && keys._bsontype == 'Code')) {
selector.group.$keyf = keys && keys._bsontype == 'Code'
? keys
: new Code(keys);
} else {
@ -3073,7 +3089,7 @@ var group = function(self, keys, condition, initial, reduce, finalize, command,
});
} else {
// Create execution scope
var scope = reduce != null && reduce instanceof Code
var scope = reduce != null && reduce._bsontype == 'Code'
? reduce.scope
: {};
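
The deprecation note above recommends rewriting `group` calls with the aggregation framework. A rough, hedged sketch of what such a rewrite can look like; the collection, field, and grouping key are hypothetical.

```
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://localhost:27017/test', function(err, db) {
  if(err) throw err;
  // group(['category'], {}, {count: 0}, function(obj, prev) { prev.count++; })
  // roughly becomes:
  db.collection('products').aggregate(
    [ { $group: { _id: '$category', count: { $sum: 1 } } } ],
    function(err, results) {
      if(err) console.error(err);
      else console.log(results);
      db.close();
    });
});
```
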
@ -3100,7 +3116,7 @@ define.classMethod('group', {callback: true, promise:true});
* @ignore
*/
function processScope (scope) {
if(!isObject(scope) || scope instanceof ObjectID) {
if(!isObject(scope) || scope._bsontype == 'ObjectID') {
return scope;
}
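
A tiny sketch of the duck-typed check the hunks above switch to in place of `instanceof`, presumably so the test still matches when more than one copy of the bson module is loaded and `instanceof` would fail across them. The helper name is my own.

```
var Code = require('mongodb').Code;

// Match BSON Code values by their _bsontype tag rather than by constructor
function isCode(value) {
  return value != null && value._bsontype == 'Code';
}

console.log(isCode(new Code('function() { return 1; }'))); // true
console.log(isCode(function() { return 1; }));              // false
```
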
@ -3183,16 +3199,23 @@ var mapReduce = function(self, map, reduce, options, callback) {
, reduce: reduce
};
// Exclusion list
var exclusionList = ['readPreference'];
// Add any other options passed in
for(var n in options) {
if('scope' == n) {
mapCommandHash[n] = processScope(options[n]);
} else {
mapCommandHash[n] = options[n];
// Only include if not in exclusion list
if(exclusionList.indexOf(n) == -1) {
mapCommandHash[n] = options[n];
}
}
}
options = shallowClone(options);
// Ensure we have the right read preference inheritance
options = getReadPreference(self, options, self.s.db, self);
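
A hedged sketch of a `mapReduce` call that exercises the option handling above: the `scope` value passes through `processScope`, while `readPreference` is kept out of the server command by the exclusion list. Collection, fields, and the scope variable are placeholders.

```
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://localhost:27017/test', function(err, db) {
  if(err) throw err;
  var map = function() { emit(this.category, this.amount * factor); };
  var reduce = function(key, values) { return Array.sum(values); };

  db.collection('orders').mapReduce(map, reduce, {
    out: { inline: 1 },          // return results inline
    scope: { factor: 2 },        // made available to map/reduce on the server
    readPreference: 'secondary'  // driver-level option, not sent in the command
  }, function(err, results) {
    if(err) console.error(err);
    else console.log(results);
    db.close();
  });
});
```
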
@ -3348,6 +3371,7 @@ var testForFields = {
, numberOfRetries: 1, awaitdata: 1, awaitData: 1, exhaust: 1, batchSize: 1, returnKey: 1, maxScan: 1, min: 1, max: 1, showDiskLoc: 1
, comment: 1, raw: 1, readPreference: 1, partial: 1, read: 1, dbName: 1, oplogReplay: 1, connection: 1, maxTimeMS: 1, transforms: 1
, collation: 1
, noCursorTimeout: 1
}
module.exports = Collection;

View file

@ -84,7 +84,7 @@ var CommandCursor = function(bson, ns, cmd, options, topology, topologyOptions)
, streamOptions: streamOptions
// BSON
, bson: bson
// Namespae
// Namespace
, ns: ns
// Command
, cmd: cmd
@ -131,8 +131,8 @@ var CommandCursor = function(bson, ns, cmd, options, topology, topologyOptions)
inherits(CommandCursor, Readable);
// Set the methods to inherit from prototype
var methodsToInherit = ['_next', 'next', 'each', 'forEach', 'toArray'
, 'rewind', 'bufferedCount', 'readBufferedDocuments', 'close', 'isClosed', 'kill'
var methodsToInherit = ['_next', 'next', 'hasNext', 'each', 'forEach', 'toArray'
, 'rewind', 'bufferedCount', 'readBufferedDocuments', 'close', 'isClosed', 'kill', 'setCursorBatchSize'
, '_find', '_getmore', '_killcursor', 'isDead', 'explain', 'isNotified', 'isKilled'];
// Only inherit the types we need
@ -207,6 +207,7 @@ define.classMethod('toArray', {callback: true, promise:true});
define.classMethod('each', {callback: true, promise:false});
define.classMethod('forEach', {callback: true, promise:false});
define.classMethod('next', {callback: true, promise:true});
define.classMethod('hasNext', {callback: true, promise:true});
define.classMethod('close', {callback: true, promise:true});
define.classMethod('isClosed', {callback: false, promise:false, returns: [Boolean]});
define.classMethod('rewind', {callback: false, promise:false});
@ -221,6 +222,14 @@ define.classMethod('readBufferedDocuments', {callback: false, promise:false, ret
* @return {Promise} returns Promise if no callback passed
*/
/**
* Check if there is any document still available in the cursor
* @function CommandCursor.prototype.hasNext
* @param {CommandCursor~resultCallback} [callback] The result callback.
* @throws {MongoError}
* @return {Promise} returns Promise if no callback passed
*/
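
A sketch of draining a command cursor with the `hasNext`/`next` pair documented above, using the `listCollections` cursor as an example; the database name is a placeholder.

```
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://localhost:27017/test', function(err, db) {
  if(err) throw err;
  var cursor = db.listCollections();

  (function drain() {
    cursor.hasNext(function(err, more) {
      if(err || !more) return db.close();
      cursor.next(function(err, doc) {
        if(err) return db.close();
        console.log('collection:', doc.name);
        drain();
      });
    });
  })();
});
```
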
/**
* The callback format for results
* @callback CommandCursor~toArrayResultCallback
@ -231,7 +240,7 @@ define.classMethod('readBufferedDocuments', {callback: false, promise:false, ret
/**
* Returns an array of documents. The caller is responsible for making sure that there
* is enough memory to store the results. Note that the array only contain partial
* results when this cursor had been previouly accessed.
* results when this cursor had been previously accessed.
* @method CommandCursor.prototype.toArray
* @param {CommandCursor~toArrayResultCallback} [callback] The result callback.
* @throws {MongoError}
@ -247,7 +256,7 @@ define.classMethod('readBufferedDocuments', {callback: false, promise:false, ret
/**
* Iterates over all the documents for this cursor. As with **{cursor.toArray}**,
* not all of the elements will be iterated if this cursor had been previouly accessed.
* not all of the elements will be iterated if this cursor had been previously accessed.
* In that case, **{cursor.rewind}** can be used to reset the cursor. However, unlike
* **{cursor.toArray}**, the cursor will only hold a maximum of batch size elements
* at any given time if batch size is specified. Otherwise, the caller is responsible

View file

@ -155,6 +155,14 @@ var Cursor = function(bson, ns, cmd, options, topology, topologyOptions) {
// Set the sort value
this.sortValue = self.s.cmd.sort;
// Get the batchSize
var batchSize = cmd.cursor && cmd.cursor.batchSize
? cmd.cursor && cmd.cursor.batchSize
: (options.cursor && options.cursor.batchSize ? options.cursor.batchSize : 1000);
// Set the batchSize
this.setCursorBatchSize(batchSize);
}
/**
@ -213,7 +221,8 @@ Cursor.prototype.hasNext = function(callback) {
return callback(null, true);
} else {
return nextObject(self, function(err, doc) {
if(!doc) return callback(null, false);
if (err) return callback(err, null);
if (!doc) return callback(null, false);
self.s.currentDoc = doc;
callback(null, true);
});
@ -694,7 +703,7 @@ define.classMethod('next', {callback: true, promise:true});
/**
* Iterates over all the documents for this cursor. As with **{cursor.toArray}**,
* not all of the elements will be iterated if this cursor had been previouly accessed.
* not all of the elements will be iterated if this cursor had been previously accessed.
* In that case, **{cursor.rewind}** can be used to reset the cursor. However, unlike
* **{cursor.toArray}**, the cursor will only hold a maximum of batch size elements
* at any given time if batch size is specified. Otherwise, the caller is responsible
@ -813,7 +822,7 @@ define.classMethod('setReadPreference', {callback: false, promise:false, returns
/**
* Returns an array of documents. The caller is responsible for making sure that there
* is enough memory to store the results. Note that the array only contain partial
* results when this cursor had been previouly accessed. In that case,
* results when this cursor had been previously accessed. In that case,
* cursor.rewind() can be used to reset the cursor.
* @method
* @param {Cursor~toArrayResultCallback} [callback] The result callback.
@ -891,7 +900,7 @@ define.classMethod('toArray', {callback: true, promise:true});
* @param {object} [options=null] Optional settings.
* @param {number} [options.skip=null] The number of documents to skip.
* @param {number} [options.limit=null] The maximum amounts to count before aborting.
* @param {number} [options.maxTimeMS=null] Number of miliseconds to wait before aborting the query.
* @param {number} [options.maxTimeMS=null] Number of milliseconds to wait before aborting the query.
* @param {string} [options.hint=null] An index name hint for the query.
* @param {(ReadPreference|string)} [options.readPreference=null] The preferred read preference (ReadPreference.PRIMARY, ReadPreference.PRIMARY_PREFERRED, ReadPreference.SECONDARY, ReadPreference.SECONDARY_PREFERRED, ReadPreference.NEAREST).
* @param {Cursor~countResultCallback} [callback] The result callback.
@ -933,6 +942,16 @@ var count = function(self, applySkipLimit, opts, callback) {
'count': self.s.ns.substr(delimiter+1), 'query': self.s.cmd.query
}
// Apply a readConcern if set
if(self.s.cmd.readConcern) {
command.readConcern = self.s.cmd.readConcern;
}
// Apply a hint if set
if(self.s.cmd.hint) {
command.hint = self.s.cmd.hint;
}
if(typeof opts.maxTimeMS == 'number') {
command.maxTimeMS = opts.maxTimeMS;
} else if(self.s.cmd && typeof self.s.cmd.maxTimeMS == 'number') {
@ -944,11 +963,14 @@ var count = function(self, applySkipLimit, opts, callback) {
if(opts.limit) command.limit = opts.limit;
if(self.s.options.hint) command.hint = self.s.options.hint;
// Set cursor server to the same as the topology
self.server = self.topology;
// Execute the command
self.topology.command(f("%s.$cmd", self.s.ns.substr(0, delimiter))
, command, function(err, result) {
callback(err, result ? result.result.n : null)
});
}, self.options);
}
define.classMethod('count', {callback: true, promise:true});
@ -982,7 +1004,12 @@ define.classMethod('close', {callback: true, promise:true});
* @return {Cursor}
*/
Cursor.prototype.map = function(transform) {
this.cursorState.transforms = { doc: transform };
if(this.cursorState.transforms && this.cursorState.transforms.doc) {
var oldTransform = this.cursorState.transforms.doc;
this.cursorState.transforms.doc = function (doc) { return transform(oldTransform(doc)); };
} else {
this.cursorState.transforms = { doc: transform };
}
return this;
}
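
With the change above, chained `map()` calls compose instead of the last one silently replacing the earlier transforms. A small sketch, with hypothetical collection and field names:

```
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://localhost:27017/test', function(err, db) {
  if(err) throw err;
  db.collection('users').find({})
    .map(function(doc) { return doc.name; })                      // doc -> name
    .map(function(name) { return (name || '').toUpperCase(); })   // name -> NAME
    .toArray(function(err, names) {
      if(err) console.error(err);
      else console.log(names);
      db.close();
    });
});
```
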

158
node/node_modules/mongodb/lib/db.js generated vendored
View file

@ -1,6 +1,7 @@
"use strict";
var EventEmitter = require('events').EventEmitter
, authenticate = require('./authenticate')
, inherits = require('util').inherits
, getSingleProperty = require('./utils').getSingleProperty
, shallowClone = require('./utils').shallowClone
@ -21,6 +22,7 @@ var EventEmitter = require('events').EventEmitter
, Logger = require('mongodb-core').Logger
, Collection = require('./collection')
, crypto = require('crypto')
, mergeOptionsAndWriteConcern = require('./utils').mergeOptionsAndWriteConcern
, assign = require('./utils').assign;
var debugFields = ['authSource', 'w', 'wtimeout', 'j', 'native_parser', 'forceServerObjectId'
@ -171,6 +173,7 @@ var Db = function(databaseName, topology, options) {
topology.once('fullsetup', createListener(self, 'fullsetup', self));
topology.once('all', createListener(self, 'all', self));
topology.on('reconnect', createListener(self, 'reconnect', self));
topology.on('reconnectFailed', createListener(self, 'reconnectFailed', self));
}
inherits(Db, EventEmitter);
@ -411,6 +414,9 @@ define.classMethod('admin', {callback: false, promise:false, returns: [Admin]});
* @param {Collection} collection The collection instance.
*/
var collectionKeys = ['pkFactory', 'readPreference'
, 'serializeFunctions', 'strict', 'readConcern', 'ignoreUndefined', 'promoteValues', 'promoteBuffers', 'promoteLongs'];
/**
* Fetch a specific collection (containing the actual collection information). If the application does not use strict mode you can
* can use it without a callback in the following way: `var collection = db.collection('mycollection');`
@ -447,6 +453,9 @@ Db.prototype.collection = function(name, options, callback) {
options.ignoreUndefined = this.s.options.ignoreUndefined;
}
// Merge in all needed options and ensure correct writeConcern merging from db level
options = mergeOptionsAndWriteConcern(options, this.s.options, collectionKeys, true);
// Execute
if(options == null || !options.strict) {
try {
@ -454,6 +463,7 @@ Db.prototype.collection = function(name, options, callback) {
if(callback) callback(null, collection);
return collection;
} catch(err) {
// if(err instanceof MongoError && callback) return callback(err);
if(callback) return callback(err);
throw err;
}
@ -470,7 +480,7 @@ Db.prototype.collection = function(name, options, callback) {
}
// Strict mode
this.listCollections({name:name}).toArray(function(err, collections) {
this.listCollections({name:name}, options).toArray(function(err, collections) {
if(err != null) return handleCallback(callback, err, null);
if(collections.length == 0) return handleCallback(callback, toError(f("Collection %s does not exist. Currently in strict mode.", name)), null);
@ -539,6 +549,7 @@ var createCollection = function(self, name, options, callback) {
/**
* Create a new collection on a server with the specified options. Use this to create capped collections.
* More information about command options available at https://docs.mongodb.com/manual/reference/command/create/
*
* @method
* @param {string} name the collection name we wish to access.
@ -552,9 +563,17 @@ var createCollection = function(self, name, options, callback) {
* @param {boolean} [options.serializeFunctions=false] Serialize functions on any object.
* @param {boolean} [options.strict=false] Returns an error if the collection does not exist
* @param {boolean} [options.capped=false] Create a capped collection.
* @param {boolean} [options.autoIndexId=true] Create an index on the _id field of the document, True by default on MongoDB 2.2 or higher off for version < 2.2.
* @param {number} [options.size=null] The size of the capped collection in bytes.
* @param {number} [options.max=null] The maximum number of documents in the capped collection.
* @param {boolean} [options.autoIndexId=true] Create an index on the _id field of the document, True by default on MongoDB 2.2 or higher off for version < 2.2.
* @param {number} [options.flags=null] Optional. Available for the MMAPv1 storage engine only to set the usePowerOf2Sizes and the noPadding flag.
* @param {object} [options.storageEngine=null] Allows users to specify configuration to the storage engine on a per-collection basis when creating a collection on MongoDB 3.0 or higher.
* @param {object} [options.validator=null] Allows users to specify validation rules or expressions for the collection. For more information, see Document Validation on MongoDB 3.2 or higher.
* @param {string} [options.validationLevel=null] Determines how strictly MongoDB applies the validation rules to existing documents during an update on MongoDB 3.2 or higher.
* @param {string} [options.validationAction=null] Determines whether to error on invalid documents or just warn about the violations but allow invalid documents to be inserted on MongoDB 3.2 or higher.
* @param {object} [options.indexOptionDefaults=null] Allows users to specify a default configuration for indexes when creating a collection on MongoDB 3.2 or higher.
* @param {string} [options.viewOn=null] The name of the source collection or view from which to create the view. The name is not the full namespace of the collection or view; i.e. does not include the database name and implies the same database as the view to create on MongoDB 3.4 or higher.
* @param {array} [options.pipeline=null] An array that consists of the aggregation pipeline stage. create creates the view by applying the specified pipeline to the viewOn collection or view on MongoDB 3.4 or higher.
* @param {object} [options.collation=null] Specify collation (MongoDB 3.4 or higher) settings for update operation (see 3.4 documentation for available fields).
* @param {Db~collectionResultCallback} [callback] The results callback
* @return {Promise} returns Promise if no callback passed
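
A hedged sketch of `createCollection` using a few of the options listed above: a capped collection, and (on MongoDB 3.4+) a view defined via `viewOn`/`pipeline`. Collection names are invented.

```
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://localhost:27017/test', function(err, db) {
  if(err) throw err;
  // Capped collection: fixed size in bytes, bounded document count
  db.createCollection('log', { capped: true, size: 1024 * 1024, max: 1000 },
    function(err) {
      if(err) { console.error(err); return db.close(); }
      // View over `users` (MongoDB 3.4 or higher)
      db.createCollection('activeUsers', {
        viewOn: 'users',
        pipeline: [ { $match: { active: true } } ]
      }, function(err) {
        if(err) console.error(err);
        db.close();
      });
    });
});
```
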
@ -634,7 +653,7 @@ var listCollectionsTranforms = function(databaseName) {
* Get the list of all collection information for the specified db.
*
* @method
* @param {object} filter Query to filter collections by
* @param {object} [filter={}] Query to filter collections by
* @param {object} [options=null] Optional settings.
* @param {number} [options.batchSize=null] The batchSize for the returned command cursor or if pre 2.8 the systems batch collection
* @param {(ReadPreference|string)} [options.readPreference=null] The preferred read preference (ReadPreference.PRIMARY, ReadPreference.PRIMARY_PREFERRED, ReadPreference.SECONDARY, ReadPreference.SECONDARY_PREFERRED, ReadPreference.NEAREST).
@ -652,6 +671,8 @@ Db.prototype.listCollections = function(filter, options) {
// Ensure valid readPreference
if(options.readPreference) {
options.readPreference = convertReadPreference(options.readPreference);
} else {
options.readPreference = this.s.readPreference || CoreReadPreference.primary;
}
// We have a list collections command
@ -715,7 +736,7 @@ var evaluate = function(self, code, parameters, options, callback) {
if(self.serverConfig && self.serverConfig.isDestroyed()) return callback(new MongoError('topology was destroyed'));
// If not a code object translate to one
if(!(finalCode instanceof Code)) finalCode = new Code(finalCode);
if(!(finalCode && finalCode._bsontype == 'Code')) finalCode = new Code(finalCode);
// Ensure the parameters are correct
if(parameters != null && !Array.isArray(parameters) && typeof parameters !== 'function') {
finalParameters = [parameters];
@ -749,7 +770,7 @@ var evaluate = function(self, code, parameters, options, callback) {
* @param {Code} code JavaScript to execute on server.
* @param {(object|array)} parameters The parameters for the call.
* @param {object} [options=null] Optional settings.
* @param {boolean} [options.nolock=false] Tell MongoDB not to block on the evaulation of the javascript.
* @param {boolean} [options.nolock=false] Tell MongoDB not to block on the evaluation of the javascript.
* @param {Db~resultCallback} [callback] The results callback
* @deprecated Eval is deprecated on MongoDB 3.2 and forward
* @return {Promise} returns Promise if no callback passed
@ -921,7 +942,7 @@ var collections = function(self, callback) {
// Return the collection objects
handleCallback(callback, null, documents.map(function(d) {
return new Collection(self, self.s.topology, self.s.databaseName, d.name.replace(self.s.databaseName + ".", ''), self.s.pkFactory, self.s.options);
return new Collection(self, self.s.topology, self.s.databaseName, d.name, self.s.pkFactory, self.s.options);
}));
});
}
@ -1023,8 +1044,6 @@ Db.prototype.createIndex = function(name, fieldOrSpec, options, callback) {
options = options == null ? {} : options;
// Shallow clone the options
options = shallowClone(options);
// Run only against primary
options.readPreference = ReadPreference.PRIMARY;
// If we have a callback fallback
if(typeof callback == 'function') return createIndex(self, name, fieldOrSpec, options, callback);
@ -1039,12 +1058,15 @@ Db.prototype.createIndex = function(name, fieldOrSpec, options, callback) {
var createIndex = function(self, name, fieldOrSpec, options, callback) {
// Get the write concern options
var finalOptions = writeConcern({}, self, options);
var finalOptions = writeConcern({}, self, options, { readPreference: ReadPreference.PRIMARY });
// Ensure we have a callback
if(finalOptions.writeConcern && typeof callback != 'function') {
throw MongoError.create({message: "Cannot use a writeConcern without a provided callback", driver:true});
}
// Run only against primary
options.readPreference = ReadPreference.PRIMARY;
// Did the user destroy the topology
if(self.serverConfig && self.serverConfig.isDestroyed()) return callback(new MongoError('topology was destroyed'));
@ -1055,9 +1077,10 @@ var createIndex = function(self, name, fieldOrSpec, options, callback) {
// 67 = 'CannotCreateIndex' (malformed index options)
// 85 = 'IndexOptionsConflict' (index already exists with different options)
// 11000 = 'DuplicateKey' (couldn't build unique index because of dupes)
// 11600 = 'InterruptedAtShutdown' (interrupted at shutdown)
// These errors mean that the server recognized `createIndex` as a command
// and so we don't need to fallback to an insert.
if(err.code === 67 || err.code == 11000 || err.code === 85) {
if(err.code === 67 || err.code == 11000 || err.code === 85 || err.code == 11600) {
return handleCallback(callback, err, result);
}
@ -1127,7 +1150,10 @@ var ensureIndex = function(self, name, fieldOrSpec, options, callback) {
// Did the user destroy the topology
if(self.serverConfig && self.serverConfig.isDestroyed()) return callback(new MongoError('topology was destroyed'));
// Check if the index allready exists
// Merge primary readPreference
finalOptions.readPreference = ReadPreference.PRIMARY
// Check if the index already exists
self.indexInformation(name, finalOptions, function(err, indexInformation) {
if(err != null && err.code != 26) return handleCallback(callback, err, null);
// If the index does not exist, create it
@ -1448,74 +1474,6 @@ Db.prototype.removeUser = function(username, options, callback) {
});
};
var authenticate = function(self, username, password, options, callback) {
// Did the user destroy the topology
if(self.serverConfig && self.serverConfig.isDestroyed()) return callback(new MongoError('topology was destroyed'));
// the default db to authenticate against is 'self'
// if authenticate is called from a retry context, it may be another one, like admin
var authdb = options.dbName ? options.dbName : self.databaseName;
authdb = self.authSource ? self.authSource : authdb;
authdb = options.authdb ? options.authdb : authdb;
authdb = options.authSource ? options.authSource : authdb;
// Callback
var _callback = function(err, result) {
if(self.listeners('authenticated').length > 0) {
self.emit('authenticated', err, result);
}
// Return to caller
handleCallback(callback, err, result);
}
// authMechanism
var authMechanism = options.authMechanism || '';
authMechanism = authMechanism.toUpperCase();
// If classic auth delegate to auth command
if(authMechanism == 'MONGODB-CR') {
self.s.topology.auth('mongocr', authdb, username, password, function(err) {
if(err) return handleCallback(callback, err, false);
_callback(null, true);
});
} else if(authMechanism == 'PLAIN') {
self.s.topology.auth('plain', authdb, username, password, function(err) {
if(err) return handleCallback(callback, err, false);
_callback(null, true);
});
} else if(authMechanism == 'MONGODB-X509') {
self.s.topology.auth('x509', authdb, username, password, function(err) {
if(err) return handleCallback(callback, err, false);
_callback(null, true);
});
} else if(authMechanism == 'SCRAM-SHA-1') {
self.s.topology.auth('scram-sha-1', authdb, username, password, function(err) {
if(err) return handleCallback(callback, err, false);
_callback(null, true);
});
} else if(authMechanism == 'GSSAPI') {
if(process.platform == 'win32') {
self.s.topology.auth('sspi', authdb, username, password, options, function(err) {
if(err) return handleCallback(callback, err, false);
_callback(null, true);
});
} else {
self.s.topology.auth('gssapi', authdb, username, password, options, function(err) {
if(err) return handleCallback(callback, err, false);
_callback(null, true);
});
}
} else if(authMechanism == 'DEFAULT') {
self.s.topology.auth('default', authdb, username, password, function(err) {
if(err) return handleCallback(callback, err, false);
_callback(null, true);
});
} else {
handleCallback(callback, MongoError.create({message: f("authentication mechanism %s not supported", options.authMechanism), driver:true}));
}
}
/**
* Authenticate a user against the server.
* @method
@ -1525,44 +1483,11 @@ var authenticate = function(self, username, password, options, callback) {
* @param {string} [options.authMechanism=MONGODB-CR] The authentication mechanism to use, GSSAPI, MONGODB-CR, MONGODB-X509, PLAIN
* @param {Db~resultCallback} [callback] The command result callback
* @return {Promise} returns Promise if no callback passed
* @deprecated This method will no longer be available in the next major release 3.x as MongoDB 3.6 will only allow auth against users in the admin db and will no longer allow multiple credentials on a socket. Please authenticate using MongoClient.connect with auth credentials.
*/
Db.prototype.authenticate = function(username, password, options, callback) {
if(typeof options == 'function') callback = options, options = {};
var self = this;
// Shallow copy the options
options = shallowClone(options);
// Set default mechanism
if(!options.authMechanism) {
options.authMechanism = 'DEFAULT';
} else if(options.authMechanism != 'GSSAPI'
&& options.authMechanism != 'DEFAULT'
&& options.authMechanism != 'MONGODB-CR'
&& options.authMechanism != 'MONGODB-X509'
&& options.authMechanism != 'SCRAM-SHA-1'
&& options.authMechanism != 'PLAIN') {
return handleCallback(callback, MongoError.create({message: "only DEFAULT, GSSAPI, PLAIN, MONGODB-X509, SCRAM-SHA-1 or MONGODB-CR is supported by authMechanism", driver:true}));
}
// If we have a callback fallback
if(typeof callback == 'function') return authenticate(self, username, password, options, function(err, r) {
// Support failed auth method
if(err && err.message && err.message.indexOf('saslStart') != -1) err.code = 59;
// Reject error
if(err) return callback(err, r);
callback(null, r);
});
// Return a promise
return new this.s.promiseLibrary(function(resolve, reject) {
authenticate(self, username, password, options, function(err, r) {
// Support failed auth method
if(err && err.message && err.message.indexOf('saslStart') != -1) err.code = 59;
// Reject error
if(err) return reject(err);
resolve(r);
});
});
console.warn("Db.prototype.authenticate method will no longer be available in the next major release 3.x as MongoDB 3.6 will only allow auth against users in the admin db and will no longer allow multiple credentials on a socket. Please authenticate using MongoClient.connect with auth credentials.");
return authenticate.apply(this, [this].concat(Array.prototype.slice.call(arguments)));
};
define.classMethod('authenticate', {callback: true, promise:true});
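
The deprecation notice above points callers at connection-time authentication instead of `Db#authenticate`. A sketch of that replacement; the credentials, host, and `authSource` are placeholders.

```
var MongoClient = require('mongodb').MongoClient;

// Authenticate as part of connecting, rather than via db.authenticate()
var url = 'mongodb://appUser:secret@localhost:27017/test?authSource=admin';
MongoClient.connect(url, function(err, db) {
  if(err) return console.error('auth failed:', err.message);
  console.log('authenticated as appUser');
  db.close();
});
```
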
@ -1636,7 +1561,6 @@ var indexInformation = function(self, name, options, callback) {
// Did the user destroy the topology
if(self.serverConfig && self.serverConfig.isDestroyed()) return callback(new MongoError('topology was destroyed'));
// Process all the results from the index command and collection
var processResults = function(indexes) {
// Contains all the information
@ -1655,7 +1579,7 @@ var indexInformation = function(self, name, options, callback) {
}
// Get the list of indexes of the specified collection
self.collection(name).listIndexes().toArray(function(err, indexes) {
self.collection(name).listIndexes(options).toArray(function(err, indexes) {
if(err) return callback(toError(err));
if(!Array.isArray(indexes)) return handleCallback(callback, null, []);
if(full) return handleCallback(callback, null, indexes);

View file

@ -209,26 +209,27 @@ function doRead(_this) {
return __handleError(_this, new Error(errmsg));
}
if (doc.data.length() !== expectedLength) {
var buf = Buffer.isBuffer(doc.data) ? doc.data : doc.data.buffer;
if (buf.length !== expectedLength) {
if (bytesRemaining <= 0) {
errmsg = 'ExtraChunk: Got unexpected n: ' + doc.n;
return __handleError(_this, new Error(errmsg));
}
errmsg = 'ChunkIsWrongSize: Got unexpected length: ' +
doc.data.length() + ', expected: ' + expectedLength;
buf.length + ', expected: ' + expectedLength;
return __handleError(_this, new Error(errmsg));
}
_this.s.bytesRead += doc.data.length();
_this.s.bytesRead += buf.length;
if (doc.data.buffer.length === 0) {
if (buf.length === 0) {
return _this.push(null);
}
var sliceStart = null;
var sliceEnd = null;
var buf = doc.data.buffer;
if (_this.s.bytesToSkip != null) {
sliceStart = _this.s.bytesToSkip;
@ -241,7 +242,7 @@ function doRead(_this) {
// If the remaining amount of data left is < chunkSize read the right amount of data
if (_this.s.options.end && (
(_this.s.options.end - _this.s.bytesToSkip) < doc.data.length()
(_this.s.options.end - _this.s.bytesToSkip) < buf.length
)) {
sliceEnd = (_this.s.options.end - _this.s.bytesToSkip);
}
@ -298,15 +299,27 @@ function init(self) {
return;
}
self.s.cursor = self.s.chunks.find({ files_id: doc._id }).sort({ n: 1 });
self.s.bytesToSkip = handleStartOption(self, doc, self.s.options);
var filter = { files_id: doc._id };
// Currently (MongoDB 3.4.4) the skip function does not use the index;
// it needs to retrieve all the documents first and then skip them. (CS-25811)
// As a workaround we use $gte on the "n" field.
if (self.s.options && self.s.options.start != null){
var skip = Math.floor(self.s.options.start / doc.chunkSize);
if (skip > 0){
filter["n"] = {"$gte": skip};
}
}
self.s.cursor = self.s.chunks.find(filter).sort({ n: 1 });
if (self.s.readPreference) {
self.s.cursor.setReadPreference(self.s.readPreference);
}
self.s.expectedEnd = Math.ceil(doc.length / doc.chunkSize);
self.s.file = doc;
self.s.bytesToSkip = handleStartOption(self, doc, self.s.cursor,
self.s.options);
self.s.bytesToTrim = handleEndOption(self, doc, self.s.cursor,
self.s.options);
self.emit('file', doc);
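
A hedged sketch of reading a GridFS file from a byte offset; the `start` option is what drives the `$gte`-on-`n` chunk filter built above. The bucket defaults and the file name are assumptions for illustration.

```
var MongoClient = require('mongodb').MongoClient;
var GridFSBucket = require('mongodb').GridFSBucket;

MongoClient.connect('mongodb://localhost:27017/test', function(err, db) {
  if(err) throw err;
  var bucket = new GridFSBucket(db);
  // Stream the file starting 1 MiB in; earlier chunks are filtered out server-side
  bucket.openDownloadStreamByName('backup.tar', { start: 1024 * 1024 })
    .on('data', function(chunk) { console.log('read', chunk.length, 'bytes'); })
    .on('error', function(err) { console.error(err); db.close(); })
    .on('end', function() { db.close(); });
});
```
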
@ -336,11 +349,11 @@ function waitForFile(_this, callback) {
* @ignore
*/
function handleStartOption(stream, doc, cursor, options) {
function handleStartOption(stream, doc, options) {
if (options && options.start != null) {
if (options.start > doc.length) {
throw new Error('Stream start (' + options.start + ') must not be ' +
'more than the length of the file (' + doc.length +')')
'more than the length of the file (' + doc.length +')');
}
if (options.start < 0) {
throw new Error('Stream start (' + options.start + ') must not be ' +
@ -351,8 +364,6 @@ function handleStartOption(stream, doc, cursor, options) {
'greater than stream end (' + options.end + ')');
}
cursor.skip(Math.floor(options.start / doc.chunkSize));
stream.s.bytesRead = Math.floor(options.start / doc.chunkSize) *
doc.chunkSize;
stream.s.expected = Math.floor(options.start / doc.chunkSize);

View file

@ -33,6 +33,8 @@ function GridFSBucketWriteStream(bucket, filename, options) {
this.filename = filename;
this.files = bucket.s._filesCollection;
this.options = options;
// Signals the write is all done
this.done = false;
this.id = options.id ? options.id : core.BSON.ObjectId();
this.chunkSizeBytes = this.options.chunkSizeBytes;
@ -252,9 +254,13 @@ function checkChunksIndex(_this, callback) {
*/
function checkDone(_this, callback) {
if(_this.done) return true;
if (_this.state.streamEnd &&
_this.state.outstandingRequests === 0 &&
!_this.state.errored) {
// Set done so we don't trigger a duplicate createFilesDoc
_this.done = true;
// Create a new files doc
var filesDoc = createFilesDoc(_this.id, _this.length, _this.chunkSizeBytes,
_this.md5.digest('hex'), _this.filename, _this.options.contentType,
_this.options.aliases, _this.options.metadata);
@ -421,6 +427,7 @@ function doWrite(_this, chunk, encoding, callback) {
}
--_this.state.outstandingRequests;
--outstandingRequests;
if (!outstandingRequests) {
_this.emit('drain', doc);
callback && callback();

View file

@ -61,7 +61,7 @@ var REFERENCE_BY_FILENAME = 0,
*
* Modes
* - **"r"** - read only. This is the default mode.
* - **"w"** - write in truncate mode. Existing data will be overwriten.
* - **"w"** - write in truncate mode. Existing data will be overwritten.
*
* @class
* @param {Db} db A database instance to interact with.
@ -101,7 +101,7 @@ var GridStore = function GridStore(db, id, filename, mode, options) {
filename = undefined;
}
if(id instanceof ObjectID) {
if(id && id._bsontype == 'ObjectID') {
this.referenceBy = REFERENCE_BY_ID;
this.fileId = id;
this.filename = filename;
@ -427,20 +427,23 @@ var writeFile = function(self, file, callback) {
// Write a chunk
var writeChunk = function() {
fs.read(file, self.chunkSize, offset, 'binary', function(err, data, bytesRead) {
// Allocate the buffer
var _buffer = new Buffer(self.chunkSize);
// Read the file
fs.read(file, _buffer, 0, _buffer.length, offset, function(err, bytesRead, data) {
if(err) return callback(err, self);
offset = offset + bytesRead;
// Create a new chunk for the data
var chunk = new Chunk(self, {n:index++}, self.writeConcern);
chunk.write(data, function(err, chunk) {
chunk.write(data.slice(0, bytesRead), function(err, chunk) {
if(err) return callback(err, self);
chunk.save({}, function(err) {
if(err) return callback(err, self);
self.position = self.position + data.length;
self.position = self.position + bytesRead;
// Point to current chunk
self.currentChunk = chunk;
@ -972,7 +975,7 @@ var _open = function(self, options, callback) {
self.length = 0;
} else {
self.length = 0;
var txtId = self.fileId instanceof ObjectID ? self.fileId.toHexString() : self.fileId;
var txtId = self.fileId._bsontype == "ObjectID" ? self.fileId.toHexString() : self.fileId;
return error(MongoError.create({message: f("file with id %s not opened for writing", (self.referenceBy == REFERENCE_BY_ID ? txtId : self.filename)), driver:true}), self);
}

View file

@ -4,13 +4,17 @@ var parse = require('./url_parser')
, Server = require('./server')
, Mongos = require('./mongos')
, ReplSet = require('./replset')
, EventEmitter = require('events').EventEmitter
, inherits = require('util').inherits
, Define = require('./metadata')
, ReadPreference = require('./read_preference')
, Logger = require('mongodb-core').Logger
, MongoError = require('mongodb-core').MongoError
, Db = require('./db')
, f = require('util').format
, shallowClone = require('./utils').shallowClone;
, assign = require('./utils').assign
, shallowClone = require('./utils').shallowClone
, authenticate = require('./authenticate');
/**
* @fileOverview The **MongoClient** class is a class that allows for making Connections to MongoDB.
@ -26,6 +30,38 @@ var parse = require('./url_parser')
* db.close();
* });
*/
var validOptionNames = ['poolSize', 'ssl', 'sslValidate', 'sslCA', 'sslCert', 'ciphers', 'ecdhCurve',
'sslKey', 'sslPass', 'sslCRL', 'autoReconnect', 'noDelay', 'keepAlive', 'connectTimeoutMS', 'family',
'socketTimeoutMS', 'reconnectTries', 'reconnectInterval', 'ha', 'haInterval',
'replicaSet', 'secondaryAcceptableLatencyMS', 'acceptableLatencyMS',
'connectWithNoPrimary', 'authSource', 'w', 'wtimeout', 'j', 'forceServerObjectId',
'serializeFunctions', 'ignoreUndefined', 'raw', 'bufferMaxEntries',
'readPreference', 'pkFactory', 'promiseLibrary', 'readConcern', 'maxStalenessSeconds',
'loggerLevel', 'logger', 'promoteValues', 'promoteBuffers', 'promoteLongs',
'domainsEnabled', 'keepAliveInitialDelay', 'checkServerIdentity', 'validateOptions', 'appname', 'auth'];
var ignoreOptionNames = ['native_parser'];
var legacyOptionNames = ['server', 'replset', 'replSet', 'mongos', 'db'];
function validOptions(options) {
var _validOptions = validOptionNames.concat(legacyOptionNames);
for(var name in options) {
if(ignoreOptionNames.indexOf(name) != -1) {
continue;
}
if(_validOptions.indexOf(name) == -1 && options.validateOptions) {
return new MongoError(f('option %s is not supported', name));
} else if(_validOptions.indexOf(name) == -1) {
console.warn(f('the options [%s] is not supported', name));
}
if(legacyOptionNames.indexOf(name) != -1) {
console.warn(f('the server/replset/mongos options are deprecated, '
+ 'all their options are supported at the top level of the options object [%s]', validOptionNames));
}
}
}
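
A sketch of how the option validation above surfaces to callers: unknown top-level options warn by default and become an error when `validateOptions` is set. The option name used here is deliberately bogus.

```
var MongoClient = require('mongodb').MongoClient;

// Warns: "the options [notAnOption] is not supported"
MongoClient.connect('mongodb://localhost:27017/test', { notAnOption: 1 },
  function(err, db) { if(db) db.close(); });

// Errors: "option notAnOption is not supported"
MongoClient.connect('mongodb://localhost:27017/test',
  { notAnOption: 1, validateOptions: true },
  function(err) { console.error(err && err.message); });
```
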
/**
* Creates a new MongoClient instance
@ -33,6 +69,11 @@ var parse = require('./url_parser')
* @return {MongoClient} a MongoClient instance.
*/
function MongoClient() {
if(!(this instanceof MongoClient)) return new MongoClient();
// Set up event emitter
EventEmitter.call(this);
/**
* The callback format for results
* @callback MongoClient~connectCallback
@ -49,19 +90,65 @@ function MongoClient() {
*
* @method
* @param {string} url The connection URI string
* @param {object} [options=null] Optional settings.
* @param {boolean} [options.uri_decode_auth=false] Uri decode the user name and password for authentication
* @param {object} [options.db=null] A hash of options to set on the db object, see **Db constructor**
* @param {object} [options.server=null] A hash of options to set on the server objects, see **Server** constructor**
* @param {object} [options.replSet=null] A hash of options to set on the replSet object, see **ReplSet** constructor**
* @param {object} [options.mongos=null] A hash of options to set on the mongos object, see **Mongos** constructor**
* @param {object} [options] Optional settings.
* @param {number} [options.poolSize=5] The maximum size of the individual server pool.
* @param {boolean} [options.ssl=false] Enable SSL connection.
* @param {Buffer} [options.sslCA=undefined] SSL Certificate store binary buffer
* @param {Buffer} [options.sslCRL=undefined] SSL Certificate revocation list binary buffer
* @param {Buffer} [options.sslCert=undefined] SSL Certificate binary buffer
* @param {Buffer} [options.sslKey=undefined] SSL Key file binary buffer
* @param {string} [options.sslPass=undefined] SSL Certificate pass phrase
* @param {boolean|function} [options.checkServerIdentity=true] Ensure we check server identity during SSL, set to false to disable checking. Only works for Node 0.12.x or higher. You can pass in a boolean or your own checkServerIdentity override function.
* @param {boolean} [options.autoReconnect=true] Enable autoReconnect for single server instances
* @param {boolean} [options.noDelay=true] TCP Connection no delay
* @param {number} [options.family=4] Version of IP stack. Defaults to 4.
* @param {number} [options.keepAlive=30000] The number of milliseconds to wait before initiating keepAlive on the TCP socket.
* @param {number} [options.connectTimeoutMS=30000] TCP Connection timeout setting
* @param {number} [options.socketTimeoutMS=360000] TCP Socket timeout setting
* @param {number} [options.reconnectTries=30] Server attempt to reconnect #times
* @param {number} [options.reconnectInterval=1000] Server will wait # milliseconds between retries
* @param {boolean} [options.ha=true] Control if high availability monitoring runs for Replicaset or Mongos proxies.
* @param {number} [options.haInterval=10000] The High availability period for replicaset inquiry
* @param {string} [options.replicaSet=undefined] The Replicaset set name
* @param {number} [options.secondaryAcceptableLatencyMS=15] Cutoff latency point in MS for Replicaset member selection
* @param {number} [options.acceptableLatencyMS=15] Cutoff latency point in MS for Mongos proxies selection.
* @param {boolean} [options.connectWithNoPrimary=false] Sets if the driver should connect even if no primary is available
* @param {string} [options.authSource=undefined] Define the database to authenticate against
* @param {string} [options.auth.user=undefined] The username for auth
* @param {string} [options.auth.password=undefined] The password for auth
* @param {(number|string)} [options.w=null] The write concern.
* @param {number} [options.wtimeout=null] The write concern timeout.
* @param {boolean} [options.j=false] Specify a journal write concern.
* @param {boolean} [options.forceServerObjectId=false] Force server to assign _id values instead of driver.
* @param {boolean} [options.serializeFunctions=false] Serialize functions on any object.
* @param {Boolean} [options.ignoreUndefined=false] Specify if the BSON serializer should ignore undefined fields.
* @param {boolean} [options.raw=false] Return document results as raw BSON buffers.
* @param {boolean} [options.promoteLongs=true] Promotes Long values to number if they fit inside the 53 bits resolution.
* @param {boolean} [options.promoteBuffers=false] Promotes Binary BSON values to native Node Buffers.
* @param {boolean} [options.promoteValues=true] Promotes BSON values to native types where possible, set to false to only receive wrapper types.
* @param {number} [options.bufferMaxEntries=-1] Sets a cap on how many operations the driver will buffer up before giving up on getting a working connection, default is -1 which is unlimited.
* @param {(ReadPreference|string)} [options.readPreference=null] The preferred read preference (ReadPreference.PRIMARY, ReadPreference.PRIMARY_PREFERRED, ReadPreference.SECONDARY, ReadPreference.SECONDARY_PREFERRED, ReadPreference.NEAREST).
* @param {boolean} [options.domainsEnabled=false] Enable the wrapping of the callback in the current domain, disabled by default to avoid perf hit.
* @param {object} [options.pkFactory=null] A primary key factory object for generation of custom _id keys.
* @param {object} [options.promiseLibrary=null] A Promise library class the application wishes to use such as Bluebird, must be ES6 compatible
* @param {object} [options.readConcern=null] Specify a read concern for the collection. (only MongoDB 3.2 or higher supported)
* @param {string} [options.readConcern.level='local'] Specify a read concern level for the collection operations, one of [local|majority]. (only MongoDB 3.2 or higher supported)
* @param {number} [options.maxStalenessSeconds=undefined] The maximum staleness to allow for secondary reads (values under 10 seconds cannot be guaranteed).
* @param {string} [options.appname=undefined] The name of the application that created this MongoClient instance. MongoDB 3.4 and newer will print this value in the server log upon establishing each connection. It is also recorded in the slow query log and profile collections.
* @param {string} [options.loggerLevel=undefined] The logging level (error/warn/info/debug)
* @param {object} [options.logger=undefined] Custom logger object
* @param {object} [options.validateOptions=false] Validate MongoClient passed in options for correctness.
* @param {MongoClient~connectCallback} [callback] The command result callback
* @return {Promise} returns Promise if no callback passed
*/
this.connect = MongoClient.connect;
}
/**
* @ignore
*/
inherits(MongoClient, EventEmitter);
var define = MongoClient.define = new Define('MongoClient', MongoClient, false);
/**
@ -74,13 +161,54 @@ var define = MongoClient.define = new Define('MongoClient', MongoClient, false);
* @method
* @static
* @param {string} url The connection URI string
* @param {object} [options=null] Optional settings.
* @param {boolean} [options.uri_decode_auth=false] Uri decode the user name and password for authentication
* @param {object} [options.db=null] A hash of options to set on the db object, see **Db constructor**
* @param {object} [options.server=null] A hash of options to set on the server objects, see **Server** constructor**
* @param {object} [options.replSet=null] A hash of options to set on the replSet object, see **ReplSet** constructor**
* @param {object} [options.mongos=null] A hash of options to set on the mongos object, see **Mongos** constructor**
* @param {object} [options] Optional settings.
* @param {number} [options.poolSize=5] The maximum size of the individual server pool.
* @param {boolean} [options.ssl=false] Enable SSL connection.
* @param {Buffer} [options.sslCA=undefined] SSL Certificate store binary buffer
* @param {Buffer} [options.sslCRL=undefined] SSL Certificate revocation list binary buffer
* @param {Buffer} [options.sslCert=undefined] SSL Certificate binary buffer
* @param {Buffer} [options.sslKey=undefined] SSL Key file binary buffer
* @param {string} [options.sslPass=undefined] SSL Certificate pass phrase
* @param {boolean|function} [options.checkServerIdentity=true] Ensure we check server identity during SSL, set to false to disable checking. Only works for Node 0.12.x or higher. You can pass in a boolean or your own checkServerIdentity override function.
* @param {boolean} [options.autoReconnect=true] Enable autoReconnect for single server instances
* @param {boolean} [options.noDelay=true] TCP Connection no delay
* @param {number} [options.family=4] Version of IP stack. Defaults to 4.
* @param {number} [options.keepAlive=30000] The number of milliseconds to wait before initiating keepAlive on the TCP socket.
* @param {number} [options.connectTimeoutMS=30000] TCP Connection timeout setting
* @param {number} [options.socketTimeoutMS=360000] TCP Socket timeout setting
* @param {number} [options.reconnectTries=30] Server attempt to reconnect #times
* @param {number} [options.reconnectInterval=1000] Server will wait # milliseconds between retries
* @param {boolean} [options.ha=true] Control if high availability monitoring runs for Replicaset or Mongos proxies.
* @param {number} [options.haInterval=10000] The High availability period for replicaset inquiry
* @param {string} [options.replicaSet=undefined] The Replicaset set name
* @param {number} [options.secondaryAcceptableLatencyMS=15] Cutoff latency point in MS for Replicaset member selection
* @param {number} [options.acceptableLatencyMS=15] Cutoff latency point in MS for Mongos proxies selection.
* @param {boolean} [options.connectWithNoPrimary=false] Sets if the driver should connect even if no primary is available
* @param {string} [options.authSource=undefined] Define the database to authenticate against
* @param {string} [options.auth.user=undefined] The username for auth
* @param {string} [options.auth.password=undefined] The password for auth
* @param {(number|string)} [options.w=null] The write concern.
* @param {number} [options.wtimeout=null] The write concern timeout.
* @param {boolean} [options.j=false] Specify a journal write concern.
* @param {boolean} [options.forceServerObjectId=false] Force server to assign _id values instead of driver.
* @param {boolean} [options.serializeFunctions=false] Serialize functions on any object.
* @param {Boolean} [options.ignoreUndefined=false] Specify if the BSON serializer should ignore undefined fields.
* @param {boolean} [options.raw=false] Return document results as raw BSON buffers.
* @param {boolean} [options.promoteLongs=true] Promotes Long values to number if they fit inside the 53 bits resolution.
* @param {boolean} [options.promoteBuffers=false] Promotes Binary BSON values to native Node Buffers.
* @param {boolean} [options.promoteValues=true] Promotes BSON values to native types where possible, set to false to only receive wrapper types.
* @param {number} [options.bufferMaxEntries=-1] Sets a cap on how many operations the driver will buffer up before giving up on getting a working connection, default is -1 which is unlimited.
* @param {(ReadPreference|string)} [options.readPreference=null] The preferred read preference (ReadPreference.PRIMARY, ReadPreference.PRIMARY_PREFERRED, ReadPreference.SECONDARY, ReadPreference.SECONDARY_PREFERRED, ReadPreference.NEAREST).
* @param {boolean} [options.domainsEnabled=false] Enable the wrapping of the callback in the current domain, disabled by default to avoid perf hit.
* @param {object} [options.pkFactory=null] A primary key factory object for generation of custom _id keys.
* @param {object} [options.promiseLibrary=null] A Promise library class the application wishes to use such as Bluebird, must be ES6 compatible
* @param {object} [options.readConcern=null] Specify a read concern for the collection. (only MongoDB 3.2 or higher supported)
* @param {string} [options.readConcern.level='local'] Specify a read concern level for the collection operations, one of [local|majority]. (only MongoDB 3.2 or higher supported)
* @param {number} [options.maxStalenessSeconds=undefined] The maximum staleness to allow for secondary reads (values under 10 seconds cannot be guaranteed).
* @param {string} [options.appname=undefined] The name of the application that created this MongoClient instance. MongoDB 3.4 and newer will print this value in the server log upon establishing each connection. It is also recorded in the slow query log and profile collections.
* @param {string} [options.loggerLevel=undefined] The logging level (error/warn/info/debug)
* @param {object} [options.logger=undefined] Custom logger object
* @param {object} [options.validateOptions=false] Validate MongoClient passed in options for correctness.
* @param {MongoClient~connectCallback} [callback] The command result callback
* @return {Promise} returns Promise if no callback passed
*/
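
A sketch tying a handful of the options documented above together in a single connect call. Host names and the replica-set name are placeholders.

```
var MongoClient = require('mongodb').MongoClient;

var url = 'mongodb://db1.example.com:27017,db2.example.com:27017/test';
MongoClient.connect(url, {
  replicaSet: 'rs0',
  poolSize: 10,
  readPreference: 'secondaryPreferred',
  connectTimeoutMS: 30000,
  socketTimeoutMS: 360000
}, function(err, db) {
  if(err) return console.error(err);
  console.log('connected');
  db.close();
});
```
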
@ -89,6 +217,10 @@ MongoClient.connect = function(url, options, callback) {
callback = typeof args[args.length - 1] == 'function' ? args.pop() : null;
options = args.length ? args.shift() : null;
options = options || {};
var self = this;
// Validate options object
var err = validOptions(options);
// Get the promiseLibrary
var promiseLibrary = options.promiseLibrary;
@ -102,15 +234,20 @@ MongoClient.connect = function(url, options, callback) {
// Return a promise
if(typeof callback != 'function') {
return new promiseLibrary(function(resolve, reject) {
connect(url, options, function(err, db) {
// Did we have a validation error
if(err) return reject(err);
// Attempt to connect
connect(self, url, options, function(err, db) {
if(err) return reject(err);
resolve(db);
});
});
}
// Did we have a validation error
if(err) return callback(err);
// Fallback to callback based connect
connect(url, options, callback);
connect(self, url, options, callback);
}
define.staticMethod('connect', {callback: true, promise:true});
@ -130,7 +267,7 @@ var mergeOptions = function(target, source, flatten) {
var createUnifiedOptions = function(finalOptions, options) {
var childOptions = ['mongos', 'server', 'db'
, 'replset', 'db_options', 'server_options', 'rs_options', 'mongos_options'];
var noMerge = [];
var noMerge = ['readconcern'];
for(var name in options) {
if(noMerge.indexOf(name.toLowerCase()) != -1) {
@ -166,7 +303,7 @@ function translateOptions(options) {
}
// Set the socket and connection timeouts
if(options.socketTimeoutMS == null) options.socketTimeoutMS = 30000;
if(options.socketTimeoutMS == null) options.socketTimeoutMS = 360000;
if(options.connectTimeoutMS == null) options.connectTimeoutMS = 30000;
// Create server instances
@ -177,25 +314,81 @@ function translateOptions(options) {
});
}
function createReplicaset(options, callback) {
// Set default options
var servers = translateOptions(options);
// Create Db instance
new Db(options.dbName, new ReplSet(servers, options), options).open(callback);
//
// Collect all events in order from SDAM
//
function collectEvents(self, db) {
var collectedEvents = [];
if(self instanceof MongoClient) {
var events = ["timeout", "close", 'serverOpening', 'serverDescriptionChanged', 'serverHeartbeatStarted',
'serverHeartbeatSucceeded', 'serverHeartbeatFailed', 'serverClosed', 'topologyOpening',
'topologyClosed', 'topologyDescriptionChanged', 'joined', 'left', 'ping', 'ha', 'all', 'fullsetup'];
events.forEach(function(event) {
db.serverConfig.on(event, function(object1, object2) {
collectedEvents.push({
event: event, object1: object1, object2: object2
});
});
});
}
return collectedEvents;
}
function createMongos(options, callback) {
// Set default options
var servers = translateOptions(options);
// Create Db instance
new Db(options.dbName, new Mongos(servers, options), options).open(callback);
//
// Replay any events due to single server connection switching to Mongos
//
function replayEvents(self, events) {
for(var i = 0; i < events.length; i++) {
self.emit(events[i].event, events[i].object1, events[i].object2);
}
}
function createServer(options, callback) {
function relayEvents(self, db) {
if(self instanceof MongoClient) {
var events = ["timeout", "close", 'serverOpening', 'serverDescriptionChanged', 'serverHeartbeatStarted',
'serverHeartbeatSucceeded', 'serverHeartbeatFailed', 'serverClosed', 'topologyOpening',
'topologyClosed', 'topologyDescriptionChanged', 'joined', 'left', 'ping', 'ha', 'all', 'fullsetup'];
events.forEach(function(event) {
db.serverConfig.on(event, function(object1, object2) {
self.emit(event, object1, object2);
});
});
}
}
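
Since MongoClient now inherits from EventEmitter, the relay above forwards topology/SDAM events to a client instance, so they can be observed directly. A hedged sketch using event names from the list above; note it uses an explicit `new MongoClient()` instance so the relay applies.

```
var MongoClient = require('mongodb').MongoClient;

var client = new MongoClient();
client.on('serverDescriptionChanged', function(event) {
  console.log('server description changed:', event);
});
client.on('serverHeartbeatFailed', function(event) {
  console.log('heartbeat failed:', event);
});

client.connect('mongodb://localhost:27017/test', function(err, db) {
  if(err) return console.error(err);
  db.close();
});
```
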
function createReplicaset(self, options, callback) {
// Set default options
var servers = translateOptions(options);
// Create Db instance
new Db(options.dbName, servers[0], options).open(function(err, db) {
var db = new Db(options.dbName, new ReplSet(servers, options), options);
// Propagate the events to the client
relayEvents(self, db);
// Open the connection
db.open(callback);
}
function createMongos(self, options, callback) {
// Set default options
var servers = translateOptions(options);
// Create Db instance
var db = new Db(options.dbName, new Mongos(servers, options), options)
// Propagate the events to the client
relayEvents(self, db);
// Open the connection
db.open(callback);
}
function createServer(self, options, callback) {
// Set default options
var servers = translateOptions(options);
// Create db instance
var db = new Db(options.dbName, servers[0], options);
// Propagate the events to the client
var collectedEvents = collectEvents(self, db);
// Create Db instance
db.open(function(err, db) {
if(err) return callback(err);
// Check if we are really speaking to a mongos
var ismaster = db.serverConfig.lastIsMaster();
@ -205,9 +398,13 @@ function createServer(options, callback) {
// Destroy the current connection
db.close();
// Create mongos connection instead
return createMongos(options, callback);
return createMongos(self, options, callback);
}
// Fire all the events
replayEvents(self, collectedEvents);
// Propagate the events to the client
relayEvents(self, db);
// Otherwise callback
callback(err, db);
});
@ -245,7 +442,7 @@ function connectHandler(options, callback) {
}
// Authenticate
authentication_db.authenticate(options.user, options.password, options, function(err, success){
authenticate(authentication_db, options.user, options.password, options, function(err, success) {
if(success){
process.nextTick(function() {
try {
@ -273,7 +470,7 @@ function connectHandler(options, callback) {
/*
* Connect using MongoClient
*/
var connect = function(url, options, callback) {
var connect = function(self, url, options, callback) {
options = options || {};
options = shallowClone(options);
@ -285,20 +482,36 @@ var connect = function(url, options, callback) {
// Get a logger for MongoClient
var logger = Logger('MongoClient', options);
// Parse the string
var object = parse(url, options);
var _finalOptions = createUnifiedOptions({}, object);
_finalOptions = mergeOptions(_finalOptions, object, false);
_finalOptions = createUnifiedOptions(_finalOptions, options);
parse(url, options, function(err, object) {
if (err) return callback(err);
// Check if we have connection and socket timeout set
if(_finalOptions.socketTimeoutMS == null) _finalOptions.socketTimeoutMS = 30000;
if(_finalOptions.connectTimeoutMS == null) _finalOptions.connectTimeoutMS = 30000;
// Parse the string
var _finalOptions = createUnifiedOptions({}, object);
_finalOptions = mergeOptions(_finalOptions, object, false);
_finalOptions = createUnifiedOptions(_finalOptions, options);
// Failure modes
if(object.servers.length == 0) {
throw new Error("connection string must contain at least one seed host");
}
// Check if we have connection and socket timeout set
if(_finalOptions.socketTimeoutMS == null) _finalOptions.socketTimeoutMS = 360000;
if(_finalOptions.connectTimeoutMS == null) _finalOptions.connectTimeoutMS = 30000;
if (_finalOptions.db_options && _finalOptions.db_options.auth) {
delete _finalOptions.db_options.auth;
}
// Failure modes
if(object.servers.length == 0) {
throw new Error("connection string must contain at least one seed host");
}
// Do we have a replicaset then skip discovery and go straight to connectivity
if(_finalOptions.replicaSet || _finalOptions.rs_name) {
return createReplicaset(self, _finalOptions, connectHandler(_finalOptions, connectCallback));
} else if(object.servers.length > 1) {
return createMongos(self, _finalOptions, connectHandler(_finalOptions, connectCallback));
} else {
return createServer(self, _finalOptions, connectHandler(_finalOptions, connectCallback));
}
});
function connectCallback(err, db) {
if(err && err.message == 'no mongos proxies found in seed list') {
@ -313,15 +526,6 @@ var connect = function(url, options, callback) {
// Return the error and db instance
callback(err, db);
}
// Do we have a replicaset then skip discovery and go straight to connectivity
if(_finalOptions.replicaSet || _finalOptions.rs_name) {
return createReplicaset(_finalOptions, connectHandler(_finalOptions, connectCallback));
} else if(object.servers.length > 1) {
return createMongos(_finalOptions, connectHandler(_finalOptions, connectCallback));
} else {
return createServer(_finalOptions, connectHandler(_finalOptions, connectCallback));
}
}
module.exports = MongoClient
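For context, a minimal usage sketch (assumed application code, not part of the commit) of the entry point the code above implements: `MongoClient.connect` now parses the URL asynchronously, applies the new 360000 ms `socketTimeoutMS` default when unset, and picks a `Server`, `ReplSet` or `Mongos` topology from the parsed hosts. The database and collection names are placeholders.

```
// Assumed example: the URL decides the topology -- one host => Server,
// several hosts => Mongos, replicaSet= in the query string => ReplSet.
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://host1:27017,host2:27017/newsblur?replicaSet=rs0', {
  connectTimeoutMS: 30000 // socketTimeoutMS would otherwise default to 360000 here
}, function(err, db) {
  if (err) return console.error(err);
  db.collection('stories').count(function(err, count) {
    if (err) return console.error(err);
    console.log('stories:', count);
    db.close();
  });
});
```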
@ -16,10 +16,11 @@ var EventEmitter = require('events').EventEmitter
, translateOptions = require('./utils').translateOptions
, filterOptions = require('./utils').filterOptions
, mergeOptions = require('./utils').mergeOptions
, getReadPreference = require('./utils').getReadPreference
, os = require('os');
// Get package.json variable
var driverVersion = require(__dirname + '/../package.json').version;
var driverVersion = require('../package.json').version;
var nodejsversion = f('Node.js %s, %s', process.version, os.endianness());
var type = os.type();
var name = process.platform;
@ -47,8 +48,8 @@ var release = os.release();
// Allowed parameters
var legalOptionNames = ['ha', 'haInterval', 'acceptableLatencyMS'
, 'poolSize', 'ssl', 'checkServerIdentity', 'sslValidate'
, 'sslCA', 'sslCert', 'sslKey', 'sslPass', 'socketOptions', 'bufferMaxEntries'
, 'poolSize', 'ssl', 'checkServerIdentity', 'sslValidate', 'ciphers', 'ecdhCurve'
, 'sslCA', 'sslCRL', 'sslCert', 'sslKey', 'sslPass', 'socketOptions', 'bufferMaxEntries'
, 'store', 'auto_reconnect', 'autoReconnect', 'emitError'
, 'keepAlive', 'noDelay', 'connectTimeoutMS', 'socketTimeoutMS'
, 'loggerLevel', 'logger', 'reconnectTries', 'appname', 'domainsEnabled'
@ -68,6 +69,7 @@ var release = os.release();
* @param {boolean|function} [options.checkServerIdentity=true] Ensure we check server identify during SSL, set to false to disable checking. Only works for Node 0.12.x or higher. You can pass in a boolean or your own checkServerIdentity override function.
* @param {object} [options.sslValidate=true] Validate mongod server certificate against ca (needs to have a mongod server with ssl support, 2.4 or higher)
* @param {array} [options.sslCA=null] Array of valid certificates either as Buffers or Strings (needs to have a mongod server with ssl support, 2.4 or higher)
* @param {array} [options.sslCRL=null] Array of revocation certificates either as Buffers or Strings (needs to have a mongod server with ssl support, 2.4 or higher)
* @param {(Buffer|string)} [options.sslCert=null] String or buffer containing the certificate we wish to present (needs to have a mongod server with ssl support, 2.4 or higher)
* @param {(Buffer|string)} [options.sslKey=null] String or buffer containing the certificate private key we wish to present (needs to have a mongod server with ssl support, 2.4 or higher)
* @param {(Buffer|string)} [options.sslPass=null] String or buffer containing the certificate password (needs to have a mongod server with ssl support, 2.4 or higher)
@ -88,6 +90,7 @@ var release = os.release();
* @fires Mongos#error
* @fires Mongos#timeout
* @fires Mongos#parseError
* @property {string} parserType the parser type used (c++ or js).
* @return {Mongos} a Mongos instance.
*/
var Mongos = function(servers, options) {
@ -207,6 +210,12 @@ Object.defineProperty(Mongos.prototype, 'isMasterDoc', {
enumerable:true, get: function() { return this.s.mongos.lastIsMaster(); }
});
Object.defineProperty(Mongos.prototype, 'parserType', {
enumerable:true, get: function() {
return this.s.mongos.parserType;
}
});
// BSON property
Object.defineProperty(Mongos.prototype, 'bson', {
enumerable: true, get: function() {
@ -274,9 +283,8 @@ Mongos.prototype.connect = function(db, _options, callback) {
// Connect handler
var connectHandler = function() {
// Clear out all the current handlers left over
["timeout", "error", "close", 'serverOpening', 'serverDescriptionChanged', 'serverHeartbeatStarted',
'serverHeartbeatSucceeded', 'serverHeartbeatFailed', 'serverClosed', 'topologyOpening',
'topologyClosed', 'topologyDescriptionChanged'].forEach(function(e) {
var events = ["timeout", "error", "close", 'fullsetup'];
events.forEach(function(e) {
self.s.mongos.removeAllListeners(e);
});
@ -285,19 +293,8 @@ Mongos.prototype.connect = function(db, _options, callback) {
self.s.mongos.once('error', errorHandler('error'));
self.s.mongos.once('close', errorHandler('close'));
// Set up SDAM listeners
self.s.mongos.on('serverDescriptionChanged', relay('serverDescriptionChanged'));
self.s.mongos.on('serverHeartbeatStarted', relay('serverHeartbeatStarted'));
self.s.mongos.on('serverHeartbeatSucceeded', relay('serverHeartbeatSucceeded'));
self.s.mongos.on('serverHeartbeatFailed', relay('serverHeartbeatFailed'));
self.s.mongos.on('serverOpening', relay('serverOpening'));
self.s.mongos.on('serverClosed', relay('serverClosed'));
self.s.mongos.on('topologyOpening', relay('topologyOpening'));
self.s.mongos.on('topologyClosed', relay('topologyClosed'));
self.s.mongos.on('topologyDescriptionChanged', relay('topologyDescriptionChanged'));
// Set up serverConfig listeners
self.s.mongos.on('fullsetup', relay('fullsetup'));
self.s.mongos.on('fullsetup', function() { self.emit('fullsetup', self); });
// Emit open event
self.emit('open', null, self);
@ -310,6 +307,25 @@ Mongos.prototype.connect = function(db, _options, callback) {
}
}
// Clear out all the current handlers left over
var events = ["timeout", "error", "close", 'serverOpening', 'serverDescriptionChanged', 'serverHeartbeatStarted',
'serverHeartbeatSucceeded', 'serverHeartbeatFailed', 'serverClosed', 'topologyOpening',
'topologyClosed', 'topologyDescriptionChanged'];
events.forEach(function(e) {
self.s.mongos.removeAllListeners(e);
});
// Set up SDAM listeners
self.s.mongos.on('serverDescriptionChanged', relay('serverDescriptionChanged'));
self.s.mongos.on('serverHeartbeatStarted', relay('serverHeartbeatStarted'));
self.s.mongos.on('serverHeartbeatSucceeded', relay('serverHeartbeatSucceeded'));
self.s.mongos.on('serverHeartbeatFailed', relay('serverHeartbeatFailed'));
self.s.mongos.on('serverOpening', relay('serverOpening'));
self.s.mongos.on('serverClosed', relay('serverClosed'));
self.s.mongos.on('topologyOpening', relay('topologyOpening'));
self.s.mongos.on('topologyClosed', relay('topologyClosed'));
self.s.mongos.on('topologyDescriptionChanged', relay('topologyDescriptionChanged'));
// Set up listeners
self.s.mongos.once('timeout', connectErrorHandler('timeout'));
self.s.mongos.once('error', connectErrorHandler('error'));
@ -338,7 +354,7 @@ define.classMethod('capabilities', {callback: false, promise:false, returns: [Se
// Command
Mongos.prototype.command = function(ns, cmd, options, callback) {
this.s.mongos.command(ns, cmd, options, callback);
this.s.mongos.command(ns, cmd, getReadPreference(options), callback);
}
define.classMethod('command', {callback: true, promise:false});
@ -390,8 +406,18 @@ Mongos.prototype.lastIsMaster = function() {
return this.s.mongos.lastIsMaster();
}
/**
* Unref all sockets
* @method
*/
Mongos.prototype.unref = function () {
return this.s.mongos.unref();
}
Mongos.prototype.close = function(forceClosed) {
this.s.mongos.destroy();
this.s.mongos.destroy({
force: typeof forceClosed == 'boolean' ? forceClosed : false,
});
// We need to wash out all stored processes
if(forceClosed == true) {
this.s.storeOptions.force = forceClosed;
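A hedged illustration of the members added above (`parserType`, `unref()`, and `close(force)` delegating to `destroy({ force: ... })`), written against an already-connected client; the URL is a placeholder.

```
// Assumed example, not from the commit: exercising the new topology helpers.
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://localhost:27017/test', function(err, db) {
  if (err) return console.error(err);
  console.log(db.serverConfig.parserType); // 'c++' or 'js', from the new getter
  db.serverConfig.unref();                 // new: sockets no longer keep the process alive
  db.close(true);                          // now calls destroy({ force: true }) underneath
});
```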
@ -39,7 +39,7 @@
* @param {string} mode The ReadPreference mode as listed above.
* @param {array|object} tags An object representing read preference tags.
* @param {object} [options] Additional read preference options
* @param {number} [options.maxStalenessSeconds] Max Secondary Read Stalleness in Seconds
* @param {number} [options.maxStalenessSeconds] Max Secondary Read Staleness in Seconds
* @return {ReadPreference} a ReadPreference instance.
*/
var ReadPreference = function(mode, tags, options) {
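The corrected `maxStalenessSeconds` option in use, as a small assumed example (the 90-second value is arbitrary):

```
// Assumed example: prefer secondaries, but skip any lagging more than 90 seconds.
var ReadPreference = require('mongodb').ReadPreference;

var pref = new ReadPreference(ReadPreference.SECONDARY_PREFERRED, null, {
  maxStalenessSeconds: 90
});
// e.g. db.collection('stories', { readPreference: pref })
```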
@ -17,6 +17,7 @@ var EventEmitter = require('events').EventEmitter
, MAX_JS_INT = require('./utils').MAX_JS_INT
, translateOptions = require('./utils').translateOptions
, filterOptions = require('./utils').filterOptions
, getReadPreference = require('./utils').getReadPreference
, mergeOptions = require('./utils').mergeOptions
, os = require('os');
/**
@ -41,14 +42,14 @@ var EventEmitter = require('events').EventEmitter
// Allowed parameters
var legalOptionNames = ['ha', 'haInterval', 'replicaSet', 'rs_name', 'secondaryAcceptableLatencyMS'
, 'connectWithNoPrimary', 'poolSize', 'ssl', 'checkServerIdentity', 'sslValidate'
, 'sslCA', 'sslCert', 'sslKey', 'sslPass', 'socketOptions', 'bufferMaxEntries'
, 'sslCA', 'sslCert', 'sslCRL', 'sslKey', 'sslPass', 'socketOptions', 'bufferMaxEntries'
, 'store', 'auto_reconnect', 'autoReconnect', 'emitError'
, 'keepAlive', 'noDelay', 'connectTimeoutMS', 'socketTimeoutMS', 'strategy', 'debug'
, 'keepAlive', 'noDelay', 'connectTimeoutMS', 'socketTimeoutMS', 'strategy', 'debug', 'family'
, 'loggerLevel', 'logger', 'reconnectTries', 'appname', 'domainsEnabled'
, 'servername', 'promoteLongs', 'promoteValues', 'promoteBuffers'];
, 'servername', 'promoteLongs', 'promoteValues', 'promoteBuffers', 'maxStalenessSeconds'];
// Get package.json variable
var driverVersion = require(__dirname + '/../package.json').version;
var driverVersion = require('../package.json').version;
var nodejsversion = f('Node.js %s, %s', process.version, os.endianness());
var type = os.type();
var name = process.platform;
@ -61,7 +62,7 @@ var release = os.release();
* @deprecated
* @param {Server[]} servers A seedlist of servers participating in the replicaset.
* @param {object} [options=null] Optional settings.
* @param {booelan} [options.ha=true] Turn on high availability monitoring.
* @param {boolean} [options.ha=true] Turn on high availability monitoring.
* @param {number} [options.haInterval=10000] Time between each replicaset status check.
* @param {string} [options.replicaSet] The name of the replicaset to connect to.
* @param {number} [options.secondaryAcceptableLatencyMS=15] Sets the range of servers to pick when using NEAREST (lowest ping ms + the latency fence, ex: range of 1 to (1 + 15) ms)
@ -71,6 +72,7 @@ var release = os.release();
* @param {boolean|function} [options.checkServerIdentity=true] Ensure we check server identify during SSL, set to false to disable checking. Only works for Node 0.12.x or higher. You can pass in a boolean or your own checkServerIdentity override function.
* @param {object} [options.sslValidate=true] Validate mongod server certificate against ca (needs to have a mongod server with ssl support, 2.4 or higher)
* @param {array} [options.sslCA=null] Array of valid certificates either as Buffers or Strings (needs to have a mongod server with ssl support, 2.4 or higher)
* @param {array} [options.sslCRL=null] Array of revocation certificates either as Buffers or Strings (needs to have a mongod server with ssl support, 2.4 or higher)
* @param {(Buffer|string)} [options.sslCert=null] String or buffer containing the certificate we wish to present (needs to have a mongod server with ssl support, 2.4 or higher)
* @param {(Buffer|string)} [options.sslKey=null] String or buffer containing the certificate private key we wish to present (needs to have a mongod server with ssl support, 2.4 or higher)
* @param {(Buffer|string)} [options.sslPass=null] String or buffer containing the certificate password (needs to have a mongod server with ssl support, 2.4 or higher)
@ -81,6 +83,7 @@ var release = os.release();
* @param {number} [options.socketOptions.connectTimeoutMS=10000] TCP Connection timeout setting
* @param {number} [options.socketOptions.socketTimeoutMS=0] TCP Socket timeout setting
* @param {boolean} [options.domainsEnabled=false] Enable the wrapping of the callback in the current domain, disabled by default to avoid perf hit.
* @param {number} [options.maxStalenessSeconds=undefined] The max staleness to secondary reads (values under 10 seconds cannot be guaranteed);
* @fires ReplSet#connect
* @fires ReplSet#ha
* @fires ReplSet#joined
@ -91,6 +94,7 @@ var release = os.release();
* @fires ReplSet#error
* @fires ReplSet#timeout
* @fires ReplSet#parseError
* @property {string} parserType the parser type used (c++ or js).
* @return {ReplSet} a ReplSet instance.
*/
var ReplSet = function(servers, options) {
@ -215,6 +219,12 @@ Object.defineProperty(ReplSet.prototype, 'isMasterDoc', {
enumerable:true, get: function() { return this.s.replset.lastIsMaster(); }
});
Object.defineProperty(ReplSet.prototype, 'parserType', {
enumerable:true, get: function() {
return this.s.replset.parserType;
}
});
// BSON property
Object.defineProperty(ReplSet.prototype, 'bson', {
enumerable: true, get: function() {
@ -260,70 +270,71 @@ ReplSet.prototype.connect = function(db, _options, callback) {
}
}
// Clear out all the current handlers left over
var events = ["timeout", "error", "close", 'serverOpening', 'serverDescriptionChanged', 'serverHeartbeatStarted',
'serverHeartbeatSucceeded', 'serverHeartbeatFailed', 'serverClosed', 'topologyOpening',
'topologyClosed', 'topologyDescriptionChanged', 'joined', 'left', 'ping', 'ha'];
events.forEach(function(e) {
self.s.replset.removeAllListeners(e);
});
// relay the event
var relay = function(event) {
return function(t, server) {
self.emit(event, t, server);
}
}
// Replset events relay
var replsetRelay = function(event) {
return function(t, server) {
self.emit(event, t, server.lastIsMaster(), server);
}
}
// Relay ha
var relayHa = function(t, state) {
self.emit('ha', t, state);
if(t == 'start') {
self.emit('ha_connect', t, state);
} else if(t == 'end') {
self.emit('ha_ismaster', t, state);
}
}
// Set up serverConfig listeners
self.s.replset.on('joined', replsetRelay('joined'));
self.s.replset.on('left', relay('left'));
self.s.replset.on('ping', relay('ping'));
self.s.replset.on('ha', relayHa);
// Set up SDAM listeners
self.s.replset.on('serverDescriptionChanged', relay('serverDescriptionChanged'));
self.s.replset.on('serverHeartbeatStarted', relay('serverHeartbeatStarted'));
self.s.replset.on('serverHeartbeatSucceeded', relay('serverHeartbeatSucceeded'));
self.s.replset.on('serverHeartbeatFailed', relay('serverHeartbeatFailed'));
self.s.replset.on('serverOpening', relay('serverOpening'));
self.s.replset.on('serverClosed', relay('serverClosed'));
self.s.replset.on('topologyOpening', relay('topologyOpening'));
self.s.replset.on('topologyClosed', relay('topologyClosed'));
self.s.replset.on('topologyDescriptionChanged', relay('topologyDescriptionChanged'));
self.s.replset.on('fullsetup', function() {
self.emit('fullsetup', self, self);
});
self.s.replset.on('all', function() {
self.emit('all', null, self);
});
// Connect handler
var connectHandler = function() {
// Clear out all the current handlers left over
["timeout", "error", "close", 'serverOpening', 'serverDescriptionChanged', 'serverHeartbeatStarted',
'serverHeartbeatSucceeded', 'serverHeartbeatFailed', 'serverClosed', 'topologyOpening',
'topologyClosed', 'topologyDescriptionChanged'].forEach(function(e) {
self.s.replset.removeAllListeners(e);
});
// Set up listeners
self.s.replset.once('timeout', errorHandler('timeout'));
self.s.replset.once('error', errorHandler('error'));
self.s.replset.once('close', errorHandler('close'));
// relay the event
var relay = function(event) {
return function(t, server) {
self.emit(event, t, server);
}
}
// Replset events relay
var replsetRelay = function(event) {
return function(t, server) {
self.emit(event, t, server.lastIsMaster(), server);
}
}
// Relay ha
var relayHa = function(t, state) {
self.emit('ha', t, state);
if(t == 'start') {
self.emit('ha_connect', t, state);
} else if(t == 'end') {
self.emit('ha_ismaster', t, state);
}
}
// Set up serverConfig listeners
self.s.replset.on('joined', replsetRelay('joined'));
self.s.replset.on('left', relay('left'));
self.s.replset.on('ping', relay('ping'));
self.s.replset.on('ha', relayHa);
// Set up SDAM listeners
self.s.replset.on('serverDescriptionChanged', relay('serverDescriptionChanged'));
self.s.replset.on('serverHeartbeatStarted', relay('serverHeartbeatStarted'));
self.s.replset.on('serverHeartbeatSucceeded', relay('serverHeartbeatSucceeded'));
self.s.replset.on('serverHeartbeatFailed', relay('serverHeartbeatFailed'));
self.s.replset.on('serverOpening', relay('serverOpening'));
self.s.replset.on('serverClosed', relay('serverClosed'));
self.s.replset.on('topologyOpening', relay('topologyOpening'));
self.s.replset.on('topologyClosed', relay('topologyClosed'));
self.s.replset.on('topologyDescriptionChanged', relay('topologyDescriptionChanged'));
self.s.replset.on('fullsetup', function() {
self.emit('fullsetup', null, self);
});
self.s.replset.on('all', function() {
self.emit('all', null, self);
});
// Emit open event
self.emit('open', null, self);
@ -378,8 +389,7 @@ define.classMethod('capabilities', {callback: false, promise:false, returns: [Se
// Command
ReplSet.prototype.command = function(ns, cmd, options, callback) {
options = translateReadPreference(options);
this.s.replset.command(ns, cmd, options, callback);
this.s.replset.command(ns, cmd, getReadPreference(options), callback);
}
define.classMethod('command', {callback: true, promise:false});
@ -411,8 +421,16 @@ ReplSet.prototype.isDestroyed = function() {
}
// IsConnected
ReplSet.prototype.isConnected = function() {
return this.s.replset.isConnected();
ReplSet.prototype.isConnected = function(options) {
options = options || {};
// If we passed in a readPreference, translate to
// a CoreReadPreference instance
if(options.readPreference) {
options.readPreference = translateReadPreference(options.readPreference);
}
return this.s.replset.isConnected(options);
}
define.classMethod('isConnected', {callback: false, promise:false, returns: [Boolean]});
@ -430,9 +448,20 @@ ReplSet.prototype.lastIsMaster = function() {
return this.s.replset.lastIsMaster();
}
/**
* Unref all sockets
* @method
*/
ReplSet.prototype.unref = function() {
return this.s.replset.unref();
}
ReplSet.prototype.close = function(forceClosed) {
var self = this;
this.s.replset.destroy();
// Call destroy on the topology
this.s.replset.destroy({
force: typeof forceClosed == 'boolean' ? forceClosed : false,
});
// We need to wash out all stored processes
if(forceClosed == true) {
this.s.storeOptions.force = forceClosed;
@ -15,10 +15,11 @@ var EventEmitter = require('events').EventEmitter
, translateOptions = require('./utils').translateOptions
, filterOptions = require('./utils').filterOptions
, mergeOptions = require('./utils').mergeOptions
, getReadPreference = require('./utils').getReadPreference
, os = require('os');
// Get package.json variable
var driverVersion = require(__dirname + '/../package.json').version;
var driverVersion = require('../package.json').version;
var nodejsversion = f('Node.js %s, %s', process.version, os.endianness());
var type = os.type();
var name = process.platform;
@ -44,10 +45,10 @@ var release = os.release();
// Allowed parameters
var legalOptionNames = ['ha', 'haInterval', 'acceptableLatencyMS'
, 'poolSize', 'ssl', 'checkServerIdentity', 'sslValidate'
, 'sslCA', 'sslCert', 'sslKey', 'sslPass', 'socketOptions', 'bufferMaxEntries'
, 'poolSize', 'ssl', 'checkServerIdentity', 'sslValidate', 'ciphers', 'ecdhCurve'
, 'sslCA', 'sslCRL', 'sslCert', 'sslKey', 'sslPass', 'socketOptions', 'bufferMaxEntries'
, 'store', 'auto_reconnect', 'autoReconnect', 'emitError'
, 'keepAlive', 'noDelay', 'connectTimeoutMS', 'socketTimeoutMS'
, 'keepAlive', 'noDelay', 'connectTimeoutMS', 'socketTimeoutMS', 'family'
, 'loggerLevel', 'logger', 'reconnectTries', 'reconnectInterval', 'monitoring'
, 'appname', 'domainsEnabled'
, 'servername', 'promoteLongs', 'promoteValues', 'promoteBuffers'];
@ -64,12 +65,13 @@ var release = os.release();
* @param {object} [options.sslValidate=true] Validate mongod server certificate against ca (needs to have a mongod server with ssl support, 2.4 or higher)
* @param {boolean|function} [options.checkServerIdentity=true] Ensure we check server identify during SSL, set to false to disable checking. Only works for Node 0.12.x or higher. You can pass in a boolean or your own checkServerIdentity override function.
* @param {array} [options.sslCA=null] Array of valid certificates either as Buffers or Strings (needs to have a mongod server with ssl support, 2.4 or higher)
* @param {array} [options.sslCRL=null] Array of revocation certificates either as Buffers or Strings (needs to have a mongod server with ssl support, 2.4 or higher)
* @param {(Buffer|string)} [options.sslCert=null] String or buffer containing the certificate we wish to present (needs to have a mongod server with ssl support, 2.4 or higher)
* @param {(Buffer|string)} [options.sslKey=null] String or buffer containing the certificate private key we wish to present (needs to have a mongod server with ssl support, 2.4 or higher)
* @param {(Buffer|string)} [options.sslPass=null] String or buffer containing the certificate password (needs to have a mongod server with ssl support, 2.4 or higher)
* @param {string} [options.servername=null] String containing the server name requested via TLS SNI.
* @param {boolean} [options.autoReconnect=true] Reconnect on error or timeout.
* @param {object} [options.socketOptions=null] Socket options
* @param {boolean} [options.socketOptions.autoReconnect=true] Reconnect on error.
* @param {boolean} [options.socketOptions.noDelay=true] TCP Socket NoDelay option.
* @param {number} [options.socketOptions.keepAlive=0] TCP KeepAlive on the socket with a X ms delay before start.
* @param {number} [options.socketOptions.connectTimeoutMS=0] TCP Connection timeout setting
@ -85,6 +87,7 @@ var release = os.release();
* @fires Server#timeout
* @fires Server#parseError
* @fires Server#reconnect
* @property {string} parserType the parser type used (c++ or js).
* @return {Server} a Server instance.
*/
var Server = function(host, port, options) {
@ -212,6 +215,12 @@ Object.defineProperty(Server.prototype, 'isMasterDoc', {
}
});
Object.defineProperty(Server.prototype, 'parserType', {
enumerable:true, get: function() {
return this.s.server.parserType;
}
});
// Last ismaster
Object.defineProperty(Server.prototype, 'poolSize', {
enumerable:true, get: function() { return this.s.server.connections().length; }
@ -286,12 +295,17 @@ Server.prototype.connect = function(db, _options, callback) {
self.s.store.flush();
}
// relay the event
var relay = function(event) {
return function(t, server) {
self.emit(event, t, server);
}
}
// Connect handler
var connectHandler = function() {
// Clear out all the current handlers left over
["timeout", "error", "close", 'serverOpening', 'serverDescriptionChanged', 'serverHeartbeatStarted',
'serverHeartbeatSucceeded', 'serverHeartbeatFailed', 'serverClosed', 'topologyOpening',
'topologyClosed', 'topologyDescriptionChanged'].forEach(function(e) {
["timeout", "error", "close", 'destroy'].forEach(function(e) {
self.s.server.removeAllListeners(e);
});
@ -302,26 +316,6 @@ Server.prototype.connect = function(db, _options, callback) {
// Only called on destroy
self.s.server.on('destroy', destroyHandler);
// relay the event
var relay = function(event) {
return function(t, server) {
self.emit(event, t, server);
}
}
// Set up SDAM listeners
self.s.server.on('serverDescriptionChanged', relay('serverDescriptionChanged'));
self.s.server.on('serverHeartbeatStarted', relay('serverHeartbeatStarted'));
self.s.server.on('serverHeartbeatSucceeded', relay('serverHeartbeatSucceeded'));
self.s.server.on('serverHeartbeatFailed', relay('serverHeartbeatFailed'));
self.s.server.on('serverOpening', relay('serverOpening'));
self.s.server.on('serverClosed', relay('serverClosed'));
self.s.server.on('topologyOpening', relay('topologyOpening'));
self.s.server.on('topologyClosed', relay('topologyClosed'));
self.s.server.on('topologyDescriptionChanged', relay('topologyDescriptionChanged'));
self.s.server.on('attemptReconnect', relay('attemptReconnect'));
self.s.server.on('monitoring', relay('monitoring'));
// Emit open event
self.emit('open', null, self);
@ -341,6 +335,13 @@ Server.prototype.connect = function(db, _options, callback) {
close: connectErrorHandler('close')
};
// Clear out all the current handlers left over
["timeout", "error", "close", 'serverOpening', 'serverDescriptionChanged', 'serverHeartbeatStarted',
'serverHeartbeatSucceeded', 'serverHeartbeatFailed', 'serverClosed', 'topologyOpening',
'topologyClosed', 'topologyDescriptionChanged'].forEach(function(e) {
self.s.server.removeAllListeners(e);
});
// Add the event handlers
self.s.server.once('timeout', connectHandlers.timeout);
self.s.server.once('error', connectHandlers.error);
@ -350,6 +351,19 @@ Server.prototype.connect = function(db, _options, callback) {
self.s.server.on('reconnect', reconnectHandler);
self.s.server.on('reconnectFailed', reconnectFailedHandler);
// Set up SDAM listeners
self.s.server.on('serverDescriptionChanged', relay('serverDescriptionChanged'));
self.s.server.on('serverHeartbeatStarted', relay('serverHeartbeatStarted'));
self.s.server.on('serverHeartbeatSucceeded', relay('serverHeartbeatSucceeded'));
self.s.server.on('serverHeartbeatFailed', relay('serverHeartbeatFailed'));
self.s.server.on('serverOpening', relay('serverOpening'));
self.s.server.on('serverClosed', relay('serverClosed'));
self.s.server.on('topologyOpening', relay('topologyOpening'));
self.s.server.on('topologyClosed', relay('topologyClosed'));
self.s.server.on('topologyDescriptionChanged', relay('topologyDescriptionChanged'));
self.s.server.on('attemptReconnect', relay('attemptReconnect'));
self.s.server.on('monitoring', relay('monitoring'));
// Start connection
self.s.server.connect(_options);
}
@ -366,7 +380,7 @@ define.classMethod('capabilities', {callback: false, promise:false, returns: [Se
// Command
Server.prototype.command = function(ns, cmd, options, callback) {
this.s.server.command(ns, cmd, options, callback);
this.s.server.command(ns, cmd, getReadPreference(options), callback);
}
define.classMethod('command', {callback: true, promise:false});
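A hedged sketch of the newly whitelisted TLS options (`sslCRL`, alongside `ciphers` and `ecdhCurve`) supplied under the 2.x `server` options block; the certificate paths are placeholders.

```
// Assumed example: sslCRL is translated to the TLS `crl` option by translateOptions,
// alongside the existing sslCA/sslCert/sslKey handling.
var fs = require('fs');
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://db.example.com:27017/test?ssl=true', {
  server: {
    sslValidate: true,
    sslCA: [fs.readFileSync('/path/to/ca.pem')],
    sslCRL: [fs.readFileSync('/path/to/crl.pem')]
  }
}, function(err, db) {
  if (err) return console.error(err);
  db.close();
});
```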
@ -2,9 +2,123 @@
var ReadPreference = require('./read_preference'),
parser = require('url'),
f = require('util').format;
f = require('util').format,
assign = require('./utils').assign,
dns = require('dns');
module.exports = function(url) {
module.exports = function(url, options, callback) {
if (typeof options === 'function') (callback = options), (options = {});
options = options || {};
var result = parser.parse(url, true);
if (result.protocol !== 'mongodb:' && result.protocol !== 'mongodb+srv:') {
return callback(new Error('invalid schema, expected mongodb or mongodb+srv'));
}
if (result.protocol === 'mongodb+srv:') {
if (result.hostname.split('.').length < 3) {
return callback(new Error('uri does not have hostname, domainname and tld'));
}
result.domainLength = result.hostname.split('.').length;
if (result.pathname && result.pathname.match(',')) {
return callback(new Error('invalid uri, cannot contain multiple hostnames'));
}
if (result.port) {
return callback(new Error('Ports not accepted with mongodb+srv'));
}
var srvAddress = '_mongodb._tcp.' + result.host;
dns.resolveSrv(srvAddress, function(err, addresses) {
if (err) return callback(err);
if (addresses.length === 0) {
return callback(new Error('No addresses found at host'));
}
for (var i = 0; i < addresses.length; i++) {
if (!matchesParentDomain(addresses[i].name, result.hostname, result.domainLength)) {
return callback(new Error('srv record does not share hostname with parent uri'));
}
}
var base = result.auth ? 'mongodb://' + result.auth + '@' : 'mongodb://';
var connectionStrings = addresses.map(function(address, i) {
if (i === 0) return base + address.name + ':' + address.port;
else return address.name + ':' + address.port;
});
var connectionString = connectionStrings.join(',') + '/';
var connectionStringOptions = [];
// Default to SSL true
if (!options.ssl && !result.search) {
connectionStringOptions.push('ssl=true');
} else if (!options.ssl && result.search && !result.search.match('ssl')) {
connectionStringOptions.push('ssl=true');
}
// Keep original uri options
if (result.search) {
connectionStringOptions.push(result.search.replace('?', ''));
}
dns.resolveTxt(result.host, function(err, record) {
if (err && err.code !== 'ENODATA') return callback(err);
if (err && err.code === 'ENODATA') record = null;
if (record) {
if (record.length > 1) {
return callback(new Error('multiple text records not allowed'));
}
record = record[0];
if (record.length > 1) record = record.join('');
else record = record[0];
if (!record.includes('authSource') && !record.includes('replicaSet')) {
return callback(new Error('text record must only set `authSource` or `replicaSet`'));
}
connectionStringOptions.push(record);
}
// Add any options to the connection string
if (connectionStringOptions.length) {
connectionString += '?' + connectionStringOptions.join('&');
}
parseHandler(connectionString, options, callback);
});
});
} else {
parseHandler(url, options, callback);
}
};
function matchesParentDomain(srvAddress, parentDomain) {
var regex = /^.*?\./;
var srv = '.' + srvAddress.replace(regex, '');
var parent = '.' + parentDomain.replace(regex, '');
if (srv.endsWith(parent)) return true;
else return false;
}
function parseHandler(address, options, callback) {
var result, err;
try {
result = parseConnectionString(address, options);
} catch (e) {
err = e;
}
return err ? callback(err, null) : callback(null, result);
}
function parseConnectionString(url, options) {
// Variables
var connection_part = '';
var auth_part = '';
@ -14,10 +128,6 @@ module.exports = function(url) {
// Url parser result
var result = parser.parse(url, true);
if(result.protocol != 'mongodb:') {
throw new Error('invalid schema, expected mongodb');
}
if((result.hostname == null || result.hostname == '') && url.indexOf('.sock') == -1) {
throw new Error('no hostname or hostnames provided in connection string');
}
@ -86,7 +196,13 @@ module.exports = function(url) {
for(i = 0; i < hosts.length; i++) {
var r = parser.parse(f('mongodb://%s', hosts[i].trim()));
if(r.path && r.path.indexOf(':') != -1) {
throw new Error('double colon in host identifier');
// Not connecting to a socket so check for an extra slash in the hostname.
// Using String#split as perf is better than match.
if (r.path.split('/').length > 1) {
throw new Error('slash in host identifier');
} else {
throw new Error('double colon in host identifier');
}
}
}
@ -144,6 +260,8 @@ module.exports = function(url) {
// Add auth to final object if we have 2 elements
if(auth.length == 2) object.auth = {user: auth[0], password: auth[1]};
// if user provided auth options, use that
if(options && options.auth != null) object.auth = options.auth;
// Variables used for temporary storage
var hostPart;
@ -190,7 +308,7 @@ module.exports = function(url) {
if(_host.indexOf("?") != -1) _host = _host.split(/\?/)[0];
}
// No entry returned for duplicate servr
// No entry returned for duplicate server
if(deduplicatedServers[_host + "_" + _port]) return null;
deduplicatedServers[_host + "_" + _port] = 1;
@ -398,8 +516,12 @@ module.exports = function(url) {
dbOptions.readPreference = 'primary';
}
// make sure that user-provided options are applied with priority
dbOptions = assign(dbOptions, options);
// Add servers to result
object.servers = servers;
// Returned parsed object
return object;
}
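Since `parse` is now asynchronous (it may need SRV and TXT lookups for `mongodb+srv:` URIs), callers receive the result via callback. A sketch against the driver's internal module layout, which is an assumption rather than a documented API:

```
// Assumed example: lib/url_parser.js is internal to the driver, so this require path
// and the shape of `object` (servers, dbName, db_options, ...) follow the code above.
var parseUrl = require('mongodb/lib/url_parser');

parseUrl('mongodb+srv://user:pass@cluster0.example.com/newsblur', {}, function(err, object) {
  if (err) return console.error(err);
  // Hosts come from the _mongodb._tcp SRV record, ssl=true is applied by default,
  // and any TXT record may only contribute authSource/replicaSet.
  console.log(object.servers, object.dbName);
});
```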
@ -1,6 +1,8 @@
"use strict";
var MongoError = require('mongodb-core').MongoError
var MongoError = require('mongodb-core').MongoError,
ReadPreference = require('./read_preference'),
CoreReadPreference = require('mongodb-core').ReadPreference;
var shallowClone = function(obj) {
var copy = {};
@ -8,6 +10,29 @@ var shallowClone = function(obj) {
return copy;
}
// Figure out the read preference
var getReadPreference = function(options) {
var r = null
if(options.readPreference) {
r = options.readPreference
} else {
return options;
}
if(r instanceof ReadPreference) {
options.readPreference = new CoreReadPreference(r.mode, r.tags, {maxStalenessSeconds: r.maxStalenessSeconds});
} else if(typeof r == 'string') {
options.readPreference = new CoreReadPreference(r);
} else if(r && !(r instanceof ReadPreference) && typeof r == 'object') {
var mode = r.mode || r.preference;
if (mode && typeof mode == 'string') {
options.readPreference = new CoreReadPreference(mode, r.tags, {maxStalenessSeconds: r.maxStalenessSeconds});
}
}
return options;
}
// Set simple property
var getSingleProperty = function(obj, name, value) {
Object.defineProperty(obj, name, {
@ -66,25 +91,25 @@ var formattedOrderClause = exports.formattedOrderClause = function(sortValue) {
var checkCollectionName = function checkCollectionName (collectionName) {
if('string' !== typeof collectionName) {
throw Error("collection name must be a String");
throw new MongoError("collection name must be a String");
}
if(!collectionName || collectionName.indexOf('..') != -1) {
throw Error("collection names cannot be empty");
throw new MongoError("collection names cannot be empty");
}
if(collectionName.indexOf('$') != -1 &&
collectionName.match(/((^\$cmd)|(oplog\.\$main))/) == null) {
throw Error("collection names must not contain '$'");
throw new MongoError("collection names must not contain '$'");
}
if(collectionName.match(/^\.|\.$/) != null) {
throw Error("collection names must not start or end with '.'");
throw new MongoError("collection names must not start or end with '.'");
}
// Validate that we are not passing 0x00 in the colletion name
// Validate that we are not passing 0x00 in the collection name
if(!!~collectionName.indexOf("\x00")) {
throw new Error("collection names cannot contain a null character");
throw new MongoError("collection names cannot contain a null character");
}
};
@ -205,7 +230,7 @@ var parseIndexOptions = function(fieldOrSpec) {
}
var isObject = exports.isObject = function (arg) {
return '[object Object]' == toString.call(arg)
return '[object Object]' == Object.prototype.toString.call(arg)
}
var debugOptions = function(debugFields, options) {
@ -237,7 +262,8 @@ var mergeOptions = function(target, source) {
var translateOptions = function(target, source) {
var translations = {
// SSL translation options
'sslCA': 'ca', 'sslValidate': 'rejectUnauthorized', 'sslKey': 'key', 'sslCert': 'cert', 'sslPass': 'passphrase',
'sslCA': 'ca', 'sslCRL': 'crl', 'sslValidate': 'rejectUnauthorized', 'sslKey': 'key',
'sslCert': 'cert', 'sslPass': 'passphrase',
// SocketTimeout translation options
'socketTimeoutMS': 'socketTimeout', 'connectTimeoutMS': 'connectionTimeout',
// Replicaset options
@ -269,7 +295,7 @@ var filterOptions = function(options, names) {
return filterOptions;
}
// Object.assign method or polyfille
// Object.assign method or polyfill
var assign = Object.assign ? Object.assign : function assign(target) {
if (target === undefined || target === null) {
throw new TypeError('Cannot convert first argument to object');
@ -294,6 +320,41 @@ var assign = Object.assign ? Object.assign : function assign(target) {
return to;
}
// Write concern keys
var writeConcernKeys = ['w', 'j', 'wtimeout', 'fsync'];
// Merge the write concern options
var mergeOptionsAndWriteConcern = function(targetOptions, sourceOptions, keys, mergeWriteConcern) {
// Mix in any allowed options
for(var i = 0; i < keys.length; i++) {
if(!targetOptions[keys[i]] && sourceOptions[keys[i]] != undefined) {
targetOptions[keys[i]] = sourceOptions[keys[i]];
}
}
// No merging of write concern
if(!mergeWriteConcern) return targetOptions;
// Found no write Concern options
var found = false;
for(var i = 0; i < writeConcernKeys.length; i++) {
if(targetOptions[writeConcernKeys[i]]) {
found = true;
break;
}
}
if(!found) {
for(var i = 0; i < writeConcernKeys.length; i++) {
if(sourceOptions[writeConcernKeys[i]]) {
targetOptions[writeConcernKeys[i]] = sourceOptions[writeConcernKeys[i]];
}
}
}
return targetOptions;
}
exports.filterOptions = filterOptions;
exports.mergeOptions = mergeOptions;
exports.translateOptions = translateOptions;
@ -310,3 +371,5 @@ exports.isObject = isObject;
exports.debugOptions = debugOptions;
exports.MAX_JS_INT = 0x20000000000000;
exports.assign = assign;
exports.mergeOptionsAndWriteConcern = mergeOptionsAndWriteConcern;
exports.getReadPreference = getReadPreference;
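Small assumed examples of the two helpers exported above; the require path assumes the driver's internal layout and the values are made up.

```
// Assumed example: exercising getReadPreference and mergeOptionsAndWriteConcern.
var utils = require('mongodb/lib/utils');

// String / plain-object read preferences are promoted to core ReadPreference instances.
var opts = utils.getReadPreference({ readPreference: 'secondaryPreferred' });

// Copy whitelisted keys, and inherit the write concern only when the target has none.
var cmdOpts = utils.mergeOptionsAndWriteConcern(
  { maxTimeMS: 5000 },
  { w: 'majority', wtimeout: 2500, readPreference: 'primary' },
  ['maxTimeMS'],
  true
);
// => { maxTimeMS: 5000, w: 'majority', wtimeout: 2500 }
```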
@ -1,49 +1,27 @@
{
"_args": [
[
{
"raw": "mongodb@2.2.12",
"scope": null,
"escapedName": "mongodb",
"name": "mongodb",
"rawSpec": "2.2.12",
"spec": "2.2.12",
"type": "version"
},
"/Users/sclay/projects/newsblur/node"
]
],
"_from": "mongodb@2.2.12",
"_id": "mongodb@2.2.12",
"_inCache": true,
"_from": "mongodb@^2.2.12",
"_id": "mongodb@2.2.36",
"_inBundle": false,
"_integrity": "sha512-P2SBLQ8Z0PVx71ngoXwo12+FiSfbNfGOClAao03/bant5DgLNkOPAck5IaJcEk4gKlQhDEURzfR3xuBG1/B+IA==",
"_location": "/mongodb",
"_nodeVersion": "7.1.0",
"_npmOperationalInternal": {
"host": "packages-12-west.internal.npmjs.com",
"tmp": "tmp/mongodb-2.2.12.tgz_1480415483308_0.9239382324740291"
},
"_npmUser": {
"name": "christkv",
"email": "christkv@gmail.com"
},
"_npmVersion": "3.10.9",
"_phantomChildren": {},
"_requested": {
"raw": "mongodb@2.2.12",
"scope": null,
"escapedName": "mongodb",
"type": "range",
"registry": true,
"raw": "mongodb@^2.2.12",
"name": "mongodb",
"rawSpec": "2.2.12",
"spec": "2.2.12",
"type": "version"
"escapedName": "mongodb",
"rawSpec": "^2.2.12",
"saveSpec": null,
"fetchSpec": "^2.2.12"
},
"_requiredBy": [
"#USER"
"#USER",
"/"
],
"_resolved": "https://registry.npmjs.org/mongodb/-/mongodb-2.2.12.tgz",
"_shasum": "2a86f10228f911e9d6fefdbd7d922188d7b730f9",
"_shrinkwrap": null,
"_spec": "mongodb@2.2.12",
"_resolved": "https://registry.npmjs.org/mongodb/-/mongodb-2.2.36.tgz",
"_shasum": "1c573680b2849fb0f47acbba3dc5fa228de975f5",
"_spec": "mongodb@^2.2.12",
"_where": "/Users/sclay/projects/newsblur/node",
"author": {
"name": "Christian Kvalheim"
@ -51,20 +29,23 @@
"bugs": {
"url": "https://github.com/mongodb/node-mongodb-native/issues"
},
"bundleDependencies": false,
"dependencies": {
"es6-promise": "3.2.1",
"mongodb-core": "2.0.14",
"readable-stream": "2.1.5"
"mongodb-core": "2.1.20",
"readable-stream": "2.2.7"
},
"deprecated": false,
"description": "The official MongoDB driver for Node.js",
"devDependencies": {
"JSONStream": "^1.0.7",
"betterbenchmarks": "^0.1.0",
"bluebird": "3.4.6",
"bson": "^0.5.1",
"bson": "latest",
"cli-table": "^0.3.1",
"co": "4.6.0",
"colors": "^1.1.2",
"conventional-changelog-cli": "^1.3.5",
"coveralls": "^2.11.6",
"eslint": "^3.8.1",
"event-stream": "^3.3.2",
@ -81,15 +62,9 @@
"semver": "5.3.0",
"worker-farm": "^1.3.1"
},
"directories": {},
"dist": {
"shasum": "2a86f10228f911e9d6fefdbd7d922188d7b730f9",
"tarball": "https://registry.npmjs.org/mongodb/-/mongodb-2.2.12.tgz"
},
"engines": {
"node": ">=0.10.3"
},
"gitHead": "4a771514a4b53dd535a1c3903bf49591c209a33f",
"homepage": "https://github.com/mongodb/node-mongodb-native",
"keywords": [
"mongodb",
@ -98,28 +73,21 @@
],
"license": "Apache-2.0",
"main": "index.js",
"maintainers": [
{
"name": "christkv",
"email": "christkv@gmail.com"
}
],
"name": "mongodb",
"nyc": {
"include": [
"lib/**/*.js"
]
},
"optionalDependencies": {},
"readme": "ERROR: No README data found!",
"repository": {
"type": "git",
"url": "git+ssh://git@github.com/mongodb/node-mongodb-native.git"
},
"scripts": {
"changelog": "conventional-changelog -p angular -i HISTORY.md -s",
"coverage": "nyc node test/runner.js -t functional && node_modules/.bin/nyc report --reporter=text-lcov | node_modules/.bin/coveralls",
"lint": "eslint lib",
"test": "node test/runner.js -t functional"
},
"version": "2.2.12"
"version": "2.2.36"
}
3176
node/node_modules/mongodb/yarn.lock generated vendored
File diff suppressed because it is too large
@ -6,3 +6,4 @@ zlib.js
.zuul.yml
.nyc_output
coverage
docs/
@ -28,8 +28,8 @@ matrix:
env: TASK=test
- node_js: 6
env: TASK=test
- node_js: 5
env: TASK=browser BROWSER_NAME=android BROWSER_VERSION="4.0..latest"
- node_js: 7
env: TASK=test
- node_js: 5
env: TASK=browser BROWSER_NAME=ie BROWSER_VERSION="9..latest"
- node_js: 5
38
node/node_modules/readable-stream/CONTRIBUTING.md generated vendored Normal file
@ -0,0 +1,38 @@
# Developer's Certificate of Origin 1.1
By making a contribution to this project, I certify that:
* (a) The contribution was created in whole or in part by me and I
have the right to submit it under the open source license
indicated in the file; or
* (b) The contribution is based upon previous work that, to the best
of my knowledge, is covered under an appropriate open source
license and I have the right under that license to submit that
work with modifications, whether created in whole or in part
by me, under the same open source license (unless I am
permitted to submit under a different license), as indicated
in the file; or
* (c) The contribution was provided directly to me by some other
person who certified (a), (b) or (c) and I have not modified
it.
* (d) I understand and agree that this project and the contribution
are public and that a record of the contribution (including all
personal information I submit with it, including my sign-off) is
maintained indefinitely and may be redistributed consistent with
this project or the open source license(s) involved.
## Moderation Policy
The [Node.js Moderation Policy] applies to this WG.
## Code of Conduct
The [Node.js Code of Conduct][] applies to this WG.
[Node.js Code of Conduct]:
https://github.com/nodejs/node/blob/master/CODE_OF_CONDUCT.md
[Node.js Moderation Policy]:
https://github.com/nodejs/TSC/blob/master/Moderation-Policy.md
136
node/node_modules/readable-stream/GOVERNANCE.md generated vendored Normal file
@ -0,0 +1,136 @@
### Streams Working Group
The Node.js Streams is jointly governed by a Working Group
(WG)
that is responsible for high-level guidance of the project.
The WG has final authority over this project including:
* Technical direction
* Project governance and process (including this policy)
* Contribution policy
* GitHub repository hosting
* Conduct guidelines
* Maintaining the list of additional Collaborators
For the current list of WG members, see the project
[README.md](./README.md#current-project-team-members).
### Collaborators
The readable-stream GitHub repository is
maintained by the WG and additional Collaborators who are added by the
WG on an ongoing basis.
Individuals making significant and valuable contributions are made
Collaborators and given commit-access to the project. These
individuals are identified by the WG and their addition as
Collaborators is discussed during the WG meeting.
_Note:_ If you make a significant contribution and are not considered
for commit-access log an issue or contact a WG member directly and it
will be brought up in the next WG meeting.
Modifications of the contents of the readable-stream repository are
made on
a collaborative basis. Anybody with a GitHub account may propose a
modification via pull request and it will be considered by the project
Collaborators. All pull requests must be reviewed and accepted by a
Collaborator with sufficient expertise who is able to take full
responsibility for the change. In the case of pull requests proposed
by an existing Collaborator, an additional Collaborator is required
for sign-off. Consensus should be sought if additional Collaborators
participate and there is disagreement around a particular
modification. See _Consensus Seeking Process_ below for further detail
on the consensus model used for governance.
Collaborators may opt to elevate significant or controversial
modifications, or modifications that have not found consensus to the
WG for discussion by assigning the ***WG-agenda*** tag to a pull
request or issue. The WG should serve as the final arbiter where
required.
For the current list of Collaborators, see the project
[README.md](./README.md#members).
### WG Membership
WG seats are not time-limited. There is no fixed size of the WG.
However, the expected target is between 6 and 12, to ensure adequate
coverage of important areas of expertise, balanced with the ability to
make decisions efficiently.
There is no specific set of requirements or qualifications for WG
membership beyond these rules.
The WG may add additional members to the WG by unanimous consensus.
A WG member may be removed from the WG by voluntary resignation, or by
unanimous consensus of all other WG members.
Changes to WG membership should be posted in the agenda, and may be
suggested as any other agenda item (see "WG Meetings" below).
If an addition or removal is proposed during a meeting, and the full
WG is not in attendance to participate, then the addition or removal
is added to the agenda for the subsequent meeting. This is to ensure
that all members are given the opportunity to participate in all
membership decisions. If a WG member is unable to attend a meeting
where a planned membership decision is being made, then their consent
is assumed.
No more than 1/3 of the WG members may be affiliated with the same
employer. If removal or resignation of a WG member, or a change of
employment by a WG member, creates a situation where more than 1/3 of
the WG membership shares an employer, then the situation must be
immediately remedied by the resignation or removal of one or more WG
members affiliated with the over-represented employer(s).
### WG Meetings
The WG meets occasionally on a Google Hangout On Air. A designated moderator
approved by the WG runs the meeting. Each meeting should be
published to YouTube.
Items are added to the WG agenda that are considered contentious or
are modifications of governance, contribution policy, WG membership,
or release process.
The intention of the agenda is not to approve or review all patches;
that should happen continuously on GitHub and be handled by the larger
group of Collaborators.
Any community member or contributor can ask that something be added to
the next meeting's agenda by logging a GitHub Issue. Any Collaborator,
WG member or the moderator can add the item to the agenda by adding
the ***WG-agenda*** tag to the issue.
Prior to each WG meeting the moderator will share the Agenda with
members of the WG. WG members can add any items they like to the
agenda at the beginning of each meeting. The moderator and the WG
cannot veto or remove items.
The WG may invite persons or representatives from certain projects to
participate in a non-voting capacity.
The moderator is responsible for summarizing the discussion of each
agenda item and sends it as a pull request after the meeting.
### Consensus Seeking Process
The WG follows a
[Consensus
Seeking](http://en.wikipedia.org/wiki/Consensus-seeking_decision-making)
decision-making model.
When an agenda item has appeared to reach a consensus the moderator
will ask "Does anyone object?" as a final call for dissent from the
consensus.
If an agenda item cannot reach a consensus a WG member can call for
either a closing vote or a vote to table the issue to the next
meeting. The call for a vote must be seconded by a majority of the WG
or else the discussion will continue. Simple majority wins.
Note that changes to WG membership require a majority consensus. See
"WG Membership" above.
@ -1,3 +1,31 @@
Node.js is licensed for use as follows:
"""
Copyright Node.js contributors. All rights reserved.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to
deal in the Software without restriction, including without limitation the
rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
sell copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
IN THE SOFTWARE.
"""
This license applies to parts of Node.js originating from the
https://github.com/joyent/node repository:
"""
Copyright Joyent, Inc. and other Node contributors. All rights reserved.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to
@ -16,3 +44,4 @@ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
IN THE SOFTWARE.
"""
@ -1,6 +1,6 @@
# readable-stream
***Node-core v6.3.1 streams for userland*** [![Build Status](https://travis-ci.org/nodejs/readable-stream.svg?branch=master)](https://travis-ci.org/nodejs/readable-stream)
***Node-core v7.0.0 streams for userland*** [![Build Status](https://travis-ci.org/nodejs/readable-stream.svg?branch=master)](https://travis-ci.org/nodejs/readable-stream)
[![NPM](https://nodei.co/npm/readable-stream.png?downloads=true&downloadRank=true)](https://nodei.co/npm/readable-stream/)
@ -16,14 +16,33 @@ npm install --save readable-stream
***Node-core streams for userland***
This package is a mirror of the Streams2 and Streams3 implementations in
Node-core, including [documentation](doc/stream.md).
Node-core.
Full documentation may be found on the [Node.js website](https://nodejs.org/dist/v7.7.3/docs/api/).
If you want to guarantee a stable streams base, regardless of what version of
Node you, or the users of your libraries are using, use **readable-stream** *only* and avoid the *"stream"* module in Node-core, for background see [this blogpost](http://r.va.gg/2014/06/why-i-dont-use-nodes-core-stream-module.html).
As of version 2.0.0 **readable-stream** uses semantic versioning.
As of version 2.0.0 **readable-stream** uses semantic versioning.
# Streams WG Team Members
# Streams Working Group
`readable-stream` is maintained by the Streams Working Group, which
oversees the development and maintenance of the Streams API within
Node.js. The responsibilities of the Streams Working Group include:
* Addressing stream issues on the Node.js issue tracker.
* Authoring and editing stream documentation within the Node.js project.
* Reviewing changes to stream subclasses within the Node.js project.
* Redirecting changes to streams from the Node.js project to this
project.
* Assisting in the implementation of stream providers within Node.js.
* Recommending versions of `readable-stream` to be included in Node.js.
* Messaging about the future of streams to give the community advance
notice of changes.
<a name="members"></a>
## Team Members
* **Chris Dickinson** ([@chrisdickinson](https://github.com/chrisdickinson)) &lt;christopher.s.dickinson@gmail.com&gt;
- Release GPG key: 9554F04D7259F04124DE6B476D5A82AC7E37093B
@ -34,3 +53,5 @@ As of version 2.0.0 **readable-stream** uses semantic versioning.
* **Sam Newman** ([@sonewman](https://github.com/sonewman)) &lt;newmansam@outlook.com&gt;
* **Mathias Buus** ([@mafintosh](https://github.com/mafintosh)) &lt;mathiasbuus@gmail.com&gt;
* **Domenic Denicola** ([@domenic](https://github.com/domenic)) &lt;d@domenic.me&gt;
* **Matteo Collina** ([@mcollina](https://github.com/mcollina)) &lt;matteo.collina@gmail.com&gt;
- Release GPG key: 3ABC01543F22DD2239285CDD818674489FBC127E
File diff suppressed because it is too large
1
node/node_modules/readable-stream/duplex-browser.js generated vendored Normal file
@ -0,0 +1 @@
module.exports = require('./lib/_stream_duplex.js');
@ -1 +1 @@
module.exports = require("./lib/_stream_duplex.js")
module.exports = require('./readable').Duplex
@ -10,6 +10,10 @@ var processNextTick = require('process-nextick-args');
var isArray = require('isarray');
/*</replacement>*/
/*<replacement>*/
var Duplex;
/*</replacement>*/
Readable.ReadableState = ReadableState;
/*<replacement>*/
@ -57,6 +61,8 @@ var StringDecoder;
util.inherits(Readable, Stream);
function prependListener(emitter, event, fn) {
// Sadly this is not cacheable as some libraries bundle their own
// event emitter implementation with them.
if (typeof emitter.prependListener === 'function') {
return emitter.prependListener(event, fn);
} else {
@ -68,7 +74,6 @@ function prependListener(emitter, event, fn) {
}
}
var Duplex;
function ReadableState(options, stream) {
Duplex = Duplex || require('./_stream_duplex');
@ -87,7 +92,7 @@ function ReadableState(options, stream) {
this.highWaterMark = hwm || hwm === 0 ? hwm : defaultHwm;
// cast to ints.
this.highWaterMark = ~ ~this.highWaterMark;
this.highWaterMark = ~~this.highWaterMark;
// A linked list is used to store data chunks instead of an array because the
// linked list can remove elements from the beginning faster than
@ -138,7 +143,6 @@ function ReadableState(options, stream) {
}
}
var Duplex;
function Readable(options) {
Duplex = Duplex || require('./_stream_duplex');
@ -461,7 +465,7 @@ function maybeReadMore_(stream, state) {
// for virtual (non-string, non-buffer) streams, "length" is somewhat
// arbitrary, and perhaps not very meaningful.
Readable.prototype._read = function (n) {
this.emit('error', new Error('not implemented'));
this.emit('error', new Error('_read() is not implemented'));
};
Readable.prototype.pipe = function (dest, pipeOpts) {
@ -639,16 +643,16 @@ Readable.prototype.unpipe = function (dest) {
state.pipesCount = 0;
state.flowing = false;
for (var _i = 0; _i < len; _i++) {
dests[_i].emit('unpipe', this);
for (var i = 0; i < len; i++) {
dests[i].emit('unpipe', this);
}return this;
}
// try to find the right one.
var i = indexOf(state.pipes, dest);
if (i === -1) return this;
var index = indexOf(state.pipes, dest);
if (index === -1) return this;
state.pipes.splice(i, 1);
state.pipes.splice(index, 1);
state.pipesCount -= 1;
if (state.pipesCount === 1) state.pipes = state.pipes[0];
@ -94,7 +94,6 @@ function Transform(options) {
this._transformState = new TransformState(this);
// when the writable side finishes, then flush out anything remaining.
var stream = this;
// start out asking for a readable event once data is transformed.
@ -111,9 +110,10 @@ function Transform(options) {
if (typeof options.flush === 'function') this._flush = options.flush;
}
// When the writable side finishes, then flush out anything remaining.
this.once('prefinish', function () {
if (typeof this._flush === 'function') this._flush(function (er) {
done(stream, er);
if (typeof this._flush === 'function') this._flush(function (er, data) {
done(stream, er, data);
});else done(stream);
});
}
@ -134,7 +134,7 @@ Transform.prototype.push = function (chunk, encoding) {
// an error, then that'll put the hurt on the whole operation. If you
// never call cb(), then you'll never get another chunk.
Transform.prototype._transform = function (chunk, encoding, cb) {
throw new Error('Not implemented');
throw new Error('_transform() is not implemented');
};
Transform.prototype._write = function (chunk, encoding, cb) {
@ -164,9 +164,11 @@ Transform.prototype._read = function (n) {
}
};
function done(stream, er) {
function done(stream, er, data) {
if (er) return stream.emit('error', er);
if (data !== null && data !== undefined) stream.push(data);
// if there's nothing in the write buffer, then that means
// that nothing more will ever be provided
var ws = stream._writableState;
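The reworked `done(stream, er, data)` lets a `_flush` implementation hand its trailing output to the callback instead of calling `push()` itself. A small assumed example:

```
// Assumed example: a transform that buffers everything and emits one joined chunk
// from _flush via the callback's second argument (pushed by done(stream, er, data)).
var util = require('util');
var Transform = require('readable-stream').Transform;

function JoinChunks(options) {
  Transform.call(this, options);
  this.chunks = [];
}
util.inherits(JoinChunks, Transform);

JoinChunks.prototype._transform = function (chunk, encoding, callback) {
  this.chunks.push(chunk.toString());
  callback(); // hold everything until the stream ends
};

JoinChunks.prototype._flush = function (callback) {
  callback(null, this.chunks.join(' ')); // the data argument is pushed for us
};

var joiner = new JoinChunks();
joiner.on('data', function (data) { console.log(data.toString()); });
joiner.write('hello');
joiner.end('world'); // prints "hello world"
```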
@ -14,6 +14,10 @@ var processNextTick = require('process-nextick-args');
var asyncWrite = !process.browser && ['v0.10', 'v0.9.'].indexOf(process.version.slice(0, 5)) > -1 ? setImmediate : processNextTick;
/*</replacement>*/
/*<replacement>*/
var Duplex;
/*</replacement>*/
Writable.WritableState = WritableState;
/*<replacement>*/
@ -54,7 +58,6 @@ function WriteReq(chunk, encoding, cb) {
this.next = null;
}
var Duplex;
function WritableState(options, stream) {
Duplex = Duplex || require('./_stream_duplex');
@ -74,8 +77,9 @@ function WritableState(options, stream) {
this.highWaterMark = hwm || hwm === 0 ? hwm : defaultHwm;
// cast to ints.
this.highWaterMark = ~ ~this.highWaterMark;
this.highWaterMark = ~~this.highWaterMark;
// drain event flag.
this.needDrain = false;
// at the start of calling end()
this.ending = false;
@@ -150,7 +154,7 @@ function WritableState(options, stream) {
this.corkedRequestsFree = new CorkedRequest(this);
}
WritableState.prototype.getBuffer = function writableStateGetBuffer() {
WritableState.prototype.getBuffer = function getBuffer() {
var current = this.bufferedRequest;
var out = [];
while (current) {
@@ -170,13 +174,37 @@ WritableState.prototype.getBuffer = function writableStateGetBuffer() {
} catch (_) {}
})();
var Duplex;
// Test _writableState for inheritance to account for Duplex streams,
// whose prototype chain only points to Readable.
var realHasInstance;
if (typeof Symbol === 'function' && Symbol.hasInstance && typeof Function.prototype[Symbol.hasInstance] === 'function') {
realHasInstance = Function.prototype[Symbol.hasInstance];
Object.defineProperty(Writable, Symbol.hasInstance, {
value: function (object) {
if (realHasInstance.call(this, object)) return true;
return object && object._writableState instanceof WritableState;
}
});
} else {
realHasInstance = function (object) {
return object instanceof this;
};
}
function Writable(options) {
Duplex = Duplex || require('./_stream_duplex');
// Writable ctor is applied to Duplexes, though they're not
// instanceof Writable, they're instanceof Readable.
if (!(this instanceof Writable) && !(this instanceof Duplex)) return new Writable(options);
// Writable ctor is applied to Duplexes, too.
// `realHasInstance` is necessary because using plain `instanceof`
// would return false, as no `_writableState` property is attached.
// Trying to use the custom `instanceof` for Writable here will also break the
// Node.js LazyTransform implementation, which has a non-trivial getter for
// `_writableState` that would lead to infinite recursion.
if (!realHasInstance.call(Writable, this) && !(this instanceof Duplex)) {
return new Writable(options);
}
this._writableState = new WritableState(options, this);
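Because Duplex streams only have `Readable` in their prototype chain, a plain `instanceof Writable` would reject them; the `Symbol.hasInstance` override above falls back to checking for `_writableState`. A standalone sketch of the same technique, using a hypothetical `Base` class rather than the real `Writable`:

```js
// Hypothetical illustration of the Symbol.hasInstance duck-typing fallback.
function Base() {
  this._writableState = {};
}

if (typeof Symbol === 'function' && Symbol.hasInstance) {
  var realHasInstance = Function.prototype[Symbol.hasInstance];
  Object.defineProperty(Base, Symbol.hasInstance, {
    value: function (object) {
      if (realHasInstance.call(this, object)) return true; // normal prototype walk
      return !!object && object._writableState !== undefined; // duck-typing fallback
    }
  });
}

var duck = { _writableState: {} }; // not created via new Base()
console.log(duck instanceof Base); // true where Symbol.hasInstance is supported
console.log({} instanceof Base);   // false
```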
@@ -204,20 +232,16 @@ function writeAfterEnd(stream, cb) {
processNextTick(cb, er);
}
// If we get something that is not a buffer, string, null, or undefined,
// and we're not in objectMode, then that's an error.
// Otherwise stream chunks are all considered to be of length=1, and the
// watermarks determine how many objects to keep in the buffer, rather than
// how many bytes or characters.
// Checks that a user-supplied chunk is valid, especially for the particular
// mode the stream is in. Currently this means that `null` is never accepted
// and undefined/non-string values are only allowed in object mode.
function validChunk(stream, state, chunk, cb) {
var valid = true;
var er = false;
// Always throw error if a null is written
// if we are not in object mode then throw
// if it is not a buffer, string, or undefined.
if (chunk === null) {
er = new TypeError('May not write null values to stream');
} else if (!Buffer.isBuffer(chunk) && typeof chunk !== 'string' && chunk !== undefined && !state.objectMode) {
} else if (typeof chunk !== 'string' && chunk !== undefined && !state.objectMode) {
er = new TypeError('Invalid non-string/buffer chunk');
}
if (er) {
@@ -231,19 +255,20 @@ function validChunk(stream, state, chunk, cb) {
Writable.prototype.write = function (chunk, encoding, cb) {
var state = this._writableState;
var ret = false;
var isBuf = Buffer.isBuffer(chunk);
if (typeof encoding === 'function') {
cb = encoding;
encoding = null;
}
if (Buffer.isBuffer(chunk)) encoding = 'buffer';else if (!encoding) encoding = state.defaultEncoding;
if (isBuf) encoding = 'buffer';else if (!encoding) encoding = state.defaultEncoding;
if (typeof cb !== 'function') cb = nop;
if (state.ended) writeAfterEnd(this, cb);else if (validChunk(this, state, chunk, cb)) {
if (state.ended) writeAfterEnd(this, cb);else if (isBuf || validChunk(this, state, chunk, cb)) {
state.pendingcb++;
ret = writeOrBuffer(this, state, chunk, encoding, cb);
ret = writeOrBuffer(this, state, isBuf, chunk, encoding, cb);
}
return ret;
@@ -283,10 +308,11 @@ function decodeChunk(state, chunk, encoding) {
// if we're already writing something, then just put this
// in the queue, and wait our turn. Otherwise, call _write
// If we return false, then we need a drain event, so set that flag.
function writeOrBuffer(stream, state, chunk, encoding, cb) {
chunk = decodeChunk(state, chunk, encoding);
if (Buffer.isBuffer(chunk)) encoding = 'buffer';
function writeOrBuffer(stream, state, isBuf, chunk, encoding, cb) {
if (!isBuf) {
chunk = decodeChunk(state, chunk, encoding);
if (Buffer.isBuffer(chunk)) encoding = 'buffer';
}
var len = state.objectMode ? 1 : chunk.length;
state.length += len;
@@ -355,8 +381,8 @@ function onwrite(stream, er) {
asyncWrite(afterWrite, stream, state, finished, cb);
/*</replacement>*/
} else {
afterWrite(stream, state, finished, cb);
}
afterWrite(stream, state, finished, cb);
}
}
}
@@ -436,7 +462,7 @@ function clearBuffer(stream, state) {
}
Writable.prototype._write = function (chunk, encoding, cb) {
cb(new Error('not implemented'));
cb(new Error('_write() is not implemented'));
};
Writable.prototype._writev = null;
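As with `_read()` and `_transform()`, the stub `_write()` above only reports an error, so writable streams implement their own. A minimal sketch with a hypothetical `Sink` class, assuming the updated `readable-stream` package:

```js
// Hypothetical example: a Writable that collects decoded chunks in memory.
var Writable = require('readable-stream').Writable;
var util = require('util');

function Sink(options) {
  Writable.call(this, options);
  this.chunks = [];
}
util.inherits(Sink, Writable);

// Subclasses override _write(); calling cb() marks the chunk as handled.
Sink.prototype._write = function (chunk, encoding, cb) {
  this.chunks.push(chunk);
  cb();
};

var sink = new Sink();
sink.write('hello ');
sink.end('world', function () {
  console.log(Buffer.concat(sink.chunks).toString()); // hello world
});
```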
@@ -507,7 +533,6 @@ function CorkedRequest(state) {
this.next = null;
this.entry = null;
this.finish = function (err) {
var entry = _this.entry;
_this.entry = null;

View file

@@ -0,0 +1,2 @@
build
test

View file

@@ -0,0 +1,48 @@
Node.js is licensed for use as follows:
"""
Copyright Node.js contributors. All rights reserved.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to
deal in the Software without restriction, including without limitation the
rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
sell copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
IN THE SOFTWARE.
"""
This license applies to parts of Node.js originating from the
https://github.com/joyent/node repository:
"""
Copyright Joyent, Inc. and other Node contributors. All rights reserved.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to
deal in the Software without restriction, including without limitation the
rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
sell copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
IN THE SOFTWARE.
"""

View file

@@ -0,0 +1,28 @@
# string_decoder
***Node-core v7.0.0 string_decoder for userland***
[![NPM](https://nodei.co/npm/string_decoder.png?downloads=true&downloadRank=true)](https://nodei.co/npm/string_decoder/)
[![NPM](https://nodei.co/npm-dl/string_decoder.png?&months=6&height=3)](https://nodei.co/npm/string_decoder/)
```bash
npm install --save string_decoder
```
***Node-core string_decoder for userland***
This package is a mirror of the string_decoder implementation in Node-core.
Full documentation may be found on the [Node.js website](https://nodejs.org/dist/v7.8.0/docs/api/).
As of version 1.0.0 **string_decoder** uses semantic versioning.
## Previous versions
Previous version numbers match the versions found in Node core, e.g. 0.10.24 matches Node 0.10.24, likewise 0.11.10 matches Node 0.11.10.
## Update
The *build/* directory contains a build script that will scrape the source from the [nodejs/node](https://github.com/nodejs/node) repo given a specific Node version.
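A short usage sketch of the buffering behaviour this package provides: `write()` returns only complete characters and holds a partial multi-byte sequence until the remaining bytes arrive (the Node core module behaves the same way for this case):

```js
var StringDecoder = require('string_decoder').StringDecoder;
var decoder = new StringDecoder('utf8');

// The euro sign (U+20AC) is three UTF-8 bytes: 0xe2 0x82 0xac.
console.log(decoder.write(Buffer.from([0xe2, 0x82]))); // '' — partial character buffered
console.log(decoder.write(Buffer.from([0xac])));       // '€'
```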

View file

@@ -0,0 +1,272 @@
'use strict';
var Buffer = require('safe-buffer').Buffer;
var isEncoding = Buffer.isEncoding || function (encoding) {
encoding = '' + encoding;
switch (encoding && encoding.toLowerCase()) {
case 'hex':case 'utf8':case 'utf-8':case 'ascii':case 'binary':case 'base64':case 'ucs2':case 'ucs-2':case 'utf16le':case 'utf-16le':case 'raw':
return true;
default:
return false;
}
};
function _normalizeEncoding(enc) {
if (!enc) return 'utf8';
var retried;
while (true) {
switch (enc) {
case 'utf8':
case 'utf-8':
return 'utf8';
case 'ucs2':
case 'ucs-2':
case 'utf16le':
case 'utf-16le':
return 'utf16le';
case 'latin1':
case 'binary':
return 'latin1';
case 'base64':
case 'ascii':
case 'hex':
return enc;
default:
if (retried) return; // undefined
enc = ('' + enc).toLowerCase();
retried = true;
}
}
};
// Do not cache `Buffer.isEncoding` when checking encoding names as some
// modules monkey-patch it to support additional encodings
function normalizeEncoding(enc) {
var nenc = _normalizeEncoding(enc);
if (typeof nenc !== 'string' && (Buffer.isEncoding === isEncoding || !isEncoding(enc))) throw new Error('Unknown encoding: ' + enc);
return nenc || enc;
}
// StringDecoder provides an interface for efficiently splitting a series of
// buffers into a series of JS strings without breaking apart multi-byte
// characters.
exports.StringDecoder = StringDecoder;
function StringDecoder(encoding) {
this.encoding = normalizeEncoding(encoding);
var nb;
switch (this.encoding) {
case 'utf16le':
this.text = utf16Text;
this.end = utf16End;
nb = 4;
break;
case 'utf8':
this.fillLast = utf8FillLast;
nb = 4;
break;
case 'base64':
this.text = base64Text;
this.end = base64End;
nb = 3;
break;
default:
this.write = simpleWrite;
this.end = simpleEnd;
return;
}
this.lastNeed = 0;
this.lastTotal = 0;
this.lastChar = Buffer.allocUnsafe(nb);
}
StringDecoder.prototype.write = function (buf) {
if (buf.length === 0) return '';
var r;
var i;
if (this.lastNeed) {
r = this.fillLast(buf);
if (r === undefined) return '';
i = this.lastNeed;
this.lastNeed = 0;
} else {
i = 0;
}
if (i < buf.length) return r ? r + this.text(buf, i) : this.text(buf, i);
return r || '';
};
StringDecoder.prototype.end = utf8End;
// Returns only complete characters in a Buffer
StringDecoder.prototype.text = utf8Text;
// Attempts to complete a partial non-UTF-8 character using bytes from a Buffer
StringDecoder.prototype.fillLast = function (buf) {
if (this.lastNeed <= buf.length) {
buf.copy(this.lastChar, this.lastTotal - this.lastNeed, 0, this.lastNeed);
return this.lastChar.toString(this.encoding, 0, this.lastTotal);
}
buf.copy(this.lastChar, this.lastTotal - this.lastNeed, 0, buf.length);
this.lastNeed -= buf.length;
};
// Checks the type of a UTF-8 byte, whether it's ASCII, a leading byte, or a
// continuation byte.
function utf8CheckByte(byte) {
if (byte <= 0x7F) return 0;else if (byte >> 5 === 0x06) return 2;else if (byte >> 4 === 0x0E) return 3;else if (byte >> 3 === 0x1E) return 4;
return -1;
}
// Checks at most 3 bytes at the end of a Buffer in order to detect an
// incomplete multi-byte UTF-8 character. The total number of bytes (2, 3, or 4)
// needed to complete the UTF-8 character (if applicable) are returned.
function utf8CheckIncomplete(self, buf, i) {
var j = buf.length - 1;
if (j < i) return 0;
var nb = utf8CheckByte(buf[j]);
if (nb >= 0) {
if (nb > 0) self.lastNeed = nb - 1;
return nb;
}
if (--j < i) return 0;
nb = utf8CheckByte(buf[j]);
if (nb >= 0) {
if (nb > 0) self.lastNeed = nb - 2;
return nb;
}
if (--j < i) return 0;
nb = utf8CheckByte(buf[j]);
if (nb >= 0) {
if (nb > 0) {
if (nb === 2) nb = 0;else self.lastNeed = nb - 3;
}
return nb;
}
return 0;
}
// Validates as many continuation bytes for a multi-byte UTF-8 character as
// needed or are available. If we see a non-continuation byte where we expect
// one, we "replace" the validated continuation bytes we've seen so far with
// UTF-8 replacement characters ('\ufffd'), to match v8's UTF-8 decoding
// behavior. The continuation byte check is included three times in the case
// where all of the continuation bytes for a character exist in the same buffer.
// It is also done this way as a slight performance increase instead of using a
// loop.
function utf8CheckExtraBytes(self, buf, p) {
if ((buf[0] & 0xC0) !== 0x80) {
self.lastNeed = 0;
return '\ufffd'.repeat(p);
}
if (self.lastNeed > 1 && buf.length > 1) {
if ((buf[1] & 0xC0) !== 0x80) {
self.lastNeed = 1;
return '\ufffd'.repeat(p + 1);
}
if (self.lastNeed > 2 && buf.length > 2) {
if ((buf[2] & 0xC0) !== 0x80) {
self.lastNeed = 2;
return '\ufffd'.repeat(p + 2);
}
}
}
}
// Attempts to complete a multi-byte UTF-8 character using bytes from a Buffer.
function utf8FillLast(buf) {
var p = this.lastTotal - this.lastNeed;
var r = utf8CheckExtraBytes(this, buf, p);
if (r !== undefined) return r;
if (this.lastNeed <= buf.length) {
buf.copy(this.lastChar, p, 0, this.lastNeed);
return this.lastChar.toString(this.encoding, 0, this.lastTotal);
}
buf.copy(this.lastChar, p, 0, buf.length);
this.lastNeed -= buf.length;
}
// Returns all complete UTF-8 characters in a Buffer. If the Buffer ended on a
// partial character, the character's bytes are buffered until the required
// number of bytes are available.
function utf8Text(buf, i) {
var total = utf8CheckIncomplete(this, buf, i);
if (!this.lastNeed) return buf.toString('utf8', i);
this.lastTotal = total;
var end = buf.length - (total - this.lastNeed);
buf.copy(this.lastChar, 0, end);
return buf.toString('utf8', i, end);
}
// For UTF-8, a replacement character for each buffered byte of a (partial)
// character needs to be added to the output.
function utf8End(buf) {
var r = buf && buf.length ? this.write(buf) : '';
if (this.lastNeed) return r + '\ufffd'.repeat(this.lastTotal - this.lastNeed);
return r;
}
// UTF-16LE typically needs two bytes per character, but even if we have an even
// number of bytes available, we need to check if we end on a leading/high
// surrogate. In that case, we need to wait for the next two bytes in order to
// decode the last character properly.
function utf16Text(buf, i) {
if ((buf.length - i) % 2 === 0) {
var r = buf.toString('utf16le', i);
if (r) {
var c = r.charCodeAt(r.length - 1);
if (c >= 0xD800 && c <= 0xDBFF) {
this.lastNeed = 2;
this.lastTotal = 4;
this.lastChar[0] = buf[buf.length - 2];
this.lastChar[1] = buf[buf.length - 1];
return r.slice(0, -1);
}
}
return r;
}
this.lastNeed = 1;
this.lastTotal = 2;
this.lastChar[0] = buf[buf.length - 1];
return buf.toString('utf16le', i, buf.length - 1);
}
// For UTF-16LE we do not explicitly append special replacement characters if we
// end on a partial character, we simply let v8 handle that.
function utf16End(buf) {
var r = buf && buf.length ? this.write(buf) : '';
if (this.lastNeed) {
var end = this.lastTotal - this.lastNeed;
return r + this.lastChar.toString('utf16le', 0, end);
}
return r;
}
function base64Text(buf, i) {
var n = (buf.length - i) % 3;
if (n === 0) return buf.toString('base64', i);
this.lastNeed = 3 - n;
this.lastTotal = 3;
if (n === 1) {
this.lastChar[0] = buf[buf.length - 1];
} else {
this.lastChar[0] = buf[buf.length - 2];
this.lastChar[1] = buf[buf.length - 1];
}
return buf.toString('base64', i, buf.length - n);
}
function base64End(buf) {
var r = buf && buf.length ? this.write(buf) : '';
if (this.lastNeed) return r + this.lastChar.toString('base64', 0, 3 - this.lastNeed);
return r;
}
// Pass bytes on through for single-byte encodings (e.g. ascii, latin1, hex)
function simpleWrite(buf) {
return buf.toString(this.encoding);
}
function simpleEnd(buf) {
return buf && buf.length ? this.write(buf) : '';
}
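The `end()` paths above differ by encoding: UTF-8 substitutes one replacement character per buffered byte of an unfinished character, while base64 flushes the buffered bytes as a padded final group. A small sketch of the observable behaviour; the trailing slash on the require path is assumed here so Node resolves this userland package instead of the core module:

```js
var StringDecoder = require('string_decoder/').StringDecoder;

// UTF-8: two bytes of a three-byte character were buffered, so end()
// emits two U+FFFD replacement characters.
var utf8 = new StringDecoder('utf8');
utf8.write(Buffer.from([0xe2, 0x82]));      // '' — partial character buffered
console.log(utf8.end() === '\ufffd\ufffd'); // true

// base64: end() encodes whatever was buffered, with padding.
var b64 = new StringDecoder('base64');
b64.write(Buffer.from('Ma'));               // '' — 2 of 3 bytes buffered
console.log(b64.end());                     // 'TWE='
```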

View file

@@ -0,0 +1,56 @@
{
"_from": "string_decoder@~1.0.0",
"_id": "string_decoder@1.0.3",
"_inBundle": false,
"_integrity": "sha512-4AH6Z5fzNNBcH+6XDMfA/BTt87skxqJlO0lAh3Dker5zThcAxG6mKz+iGu308UKoPPQ8Dcqx/4JhujzltRa+hQ==",
"_location": "/readable-stream/string_decoder",
"_phantomChildren": {},
"_requested": {
"type": "range",
"registry": true,
"raw": "string_decoder@~1.0.0",
"name": "string_decoder",
"escapedName": "string_decoder",
"rawSpec": "~1.0.0",
"saveSpec": null,
"fetchSpec": "~1.0.0"
},
"_requiredBy": [
"/readable-stream"
],
"_resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-1.0.3.tgz",
"_shasum": "0fc67d7c141825de94282dd536bec6b9bce860ab",
"_spec": "string_decoder@~1.0.0",
"_where": "/Users/sclay/projects/newsblur/node/node_modules/readable-stream",
"bugs": {
"url": "https://github.com/rvagg/string_decoder/issues"
},
"bundleDependencies": false,
"dependencies": {
"safe-buffer": "~5.1.0"
},
"deprecated": false,
"description": "The string_decoder module from Node core",
"devDependencies": {
"babel-polyfill": "^6.23.0",
"tap": "~0.4.8"
},
"homepage": "https://github.com/rvagg/string_decoder",
"keywords": [
"string",
"decoder",
"browser",
"browserify"
],
"license": "MIT",
"main": "lib/string_decoder.js",
"name": "string_decoder",
"repository": {
"type": "git",
"url": "git://github.com/rvagg/string_decoder.git"
},
"scripts": {
"test": "tap test/parallel/*.js && node test/verify-dependencies"
},
"version": "1.0.3"
}

Some files were not shown because too many files have changed in this diff.