Initial commit
This commit is contained in:
21
node_modules/pg/LICENSE
generated
vendored
Normal file
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2010 - 2021 Brian Carlson

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
89
node_modules/pg/README.md
generated
vendored
Normal file
@@ -0,0 +1,89 @@
# node-postgres

[](http://travis-ci.org/brianc/node-postgres)
<span class="badge-npmversion"><a href="https://npmjs.org/package/pg" title="View this project on NPM"><img src="https://img.shields.io/npm/v/pg.svg" alt="NPM version" /></a></span>
<span class="badge-npmdownloads"><a href="https://npmjs.org/package/pg" title="View this project on NPM"><img src="https://img.shields.io/npm/dm/pg.svg" alt="NPM downloads" /></a></span>

Non-blocking PostgreSQL client for Node.js. Pure JavaScript and optional native libpq bindings.

## Install

```sh
$ npm install pg
```

---

## :star: [Documentation](https://node-postgres.com) :star:

### Features

- Pure JavaScript client and native libpq bindings share _the same API_
- Connection pooling
- Extensible JS ↔ PostgreSQL data-type coercion
- Supported PostgreSQL features
  - Parameterized queries
  - Named statements with query plan caching
  - Async notifications with `LISTEN/NOTIFY`
  - Bulk import & export with `COPY TO/COPY FROM`

### Extras

node-postgres is by design pretty light on abstractions. These are some handy modules we've been using over the years to complete the picture.
The entire list can be found on our [wiki](https://github.com/brianc/node-postgres/wiki/Extras).

## Support

node-postgres is free software. If you encounter a bug with the library please open an issue on the [GitHub repo](https://github.com/brianc/node-postgres). If you have questions unanswered by the documentation please open an issue pointing out how the documentation was unclear & I will do my best to make it better!

When you open an issue please provide:

- version of Node
- version of Postgres
- smallest possible snippet of code to reproduce the problem

You can also follow me [@briancarlson](https://twitter.com/briancarlson) if that's your thing. I try to always announce noteworthy changes & developments with node-postgres on Twitter.

## Sponsorship :two_hearts:

node-postgres's continued development has been made possible in part by generous financial support from [the community](https://github.com/brianc/node-postgres/blob/master/SPONSORS.md).

If you or your company are benefiting from node-postgres and would like to help keep the project financially sustainable [please consider supporting](https://github.com/sponsors/brianc) its development.

## Contributing

**:heart: contributions!**

I will **happily** accept your pull request if it:

- **has tests**
- looks reasonable
- does not break backwards compatibility

If your change involves breaking backwards compatibility please point that out in the pull request & we can discuss & plan when and how to release it and what type of documentation or communication it will require.

## Troubleshooting and FAQ

The causes and solutions to common errors can be found among the [Frequently Asked Questions (FAQ)](https://github.com/brianc/node-postgres/wiki/FAQ).

## License

Copyright (c) 2010-2020 Brian Carlson (brian.m.carlson@gmail.com)

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
640
node_modules/pg/lib/client.js
generated
vendored
Normal file
@@ -0,0 +1,640 @@
'use strict'

var EventEmitter = require('events').EventEmitter
var utils = require('./utils')
var sasl = require('./crypto/sasl')
var TypeOverrides = require('./type-overrides')

var ConnectionParameters = require('./connection-parameters')
var Query = require('./query')
var defaults = require('./defaults')
var Connection = require('./connection')
const crypto = require('./crypto/utils')

class Client extends EventEmitter {
  constructor(config) {
    super()

    this.connectionParameters = new ConnectionParameters(config)
    this.user = this.connectionParameters.user
    this.database = this.connectionParameters.database
    this.port = this.connectionParameters.port
    this.host = this.connectionParameters.host

    // "hiding" the password so it doesn't show up in stack traces
    // or if the client is console.logged
    Object.defineProperty(this, 'password', {
      configurable: true,
      enumerable: false,
      writable: true,
      value: this.connectionParameters.password,
    })

    this.replication = this.connectionParameters.replication

    var c = config || {}

    this._Promise = c.Promise || global.Promise
    this._types = new TypeOverrides(c.types)
    this._ending = false
    this._ended = false
    this._connecting = false
    this._connected = false
    this._connectionError = false
    this._queryable = true

    this.connection =
      c.connection ||
      new Connection({
        stream: c.stream,
        ssl: this.connectionParameters.ssl,
        keepAlive: c.keepAlive || false,
        keepAliveInitialDelayMillis: c.keepAliveInitialDelayMillis || 0,
        encoding: this.connectionParameters.client_encoding || 'utf8',
      })
    this.queryQueue = []
    this.binary = c.binary || defaults.binary
    this.processID = null
    this.secretKey = null
    this.ssl = this.connectionParameters.ssl || false
    // As with the password, make ssl.key (the private key) non-enumerable
    // so it won't show up in stack traces
    // or if the client is console.logged
    if (this.ssl && this.ssl.key) {
      Object.defineProperty(this.ssl, 'key', {
        enumerable: false,
      })
    }

    this._connectionTimeoutMillis = c.connectionTimeoutMillis || 0
  }
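The non-enumerable `password` property set up above keeps credentials out of `console.log` and `JSON.stringify` output while leaving them readable by the client itself. The trick, reduced to a standalone sketch (the object and value here are illustrative, not the library's API):

```javascript
// A plain object standing in for a client; 'secret' is a made-up value.
const client = { user: 'postgres' }

// Non-enumerable properties are skipped by JSON.stringify, Object.keys,
// and object spread, but remain readable by direct property access.
Object.defineProperty(client, 'password', {
  configurable: true,
  enumerable: false,
  writable: true,
  value: 'secret',
})

console.log(JSON.stringify(client)) // {"user":"postgres"} - no password
console.log(client.password)        // 'secret' - still accessible
```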

  _errorAllQueries(err) {
    const enqueueError = (query) => {
      process.nextTick(() => {
        query.handleError(err, this.connection)
      })
    }

    if (this.activeQuery) {
      enqueueError(this.activeQuery)
      this.activeQuery = null
    }

    this.queryQueue.forEach(enqueueError)
    this.queryQueue.length = 0
  }

  _connect(callback) {
    var self = this
    var con = this.connection
    this._connectionCallback = callback

    if (this._connecting || this._connected) {
      const err = new Error('Client has already been connected. You cannot reuse a client.')
      process.nextTick(() => {
        callback(err)
      })
      return
    }
    this._connecting = true

    if (this._connectionTimeoutMillis > 0) {
      this.connectionTimeoutHandle = setTimeout(() => {
        con._ending = true
        con.stream.destroy(new Error('timeout expired'))
      }, this._connectionTimeoutMillis)
    }

    if (this.host && this.host.indexOf('/') === 0) {
      con.connect(this.host + '/.s.PGSQL.' + this.port)
    } else {
      con.connect(this.port, this.host)
    }
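The host check above routes a leading-slash host to a Unix-domain socket named `.s.PGSQL.<port>` inside that directory, matching Postgres's socket-file convention. A sketch of just that branch (the helper name and paths are illustrative):

```javascript
// Mirrors the branch in _connect(): a host beginning with '/' is treated
// as a directory containing the postgres unix-domain socket.
function resolveConnectTarget(host, port) {
  if (host && host.indexOf('/') === 0) {
    return { path: host + '/.s.PGSQL.' + port }
  }
  return { host, port }
}

console.log(resolveConnectTarget('/var/run/postgresql', 5432))
console.log(resolveConnectTarget('localhost', 5432))
```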

    // once connection is established send startup message
    con.on('connect', function () {
      if (self.ssl) {
        con.requestSsl()
      } else {
        con.startup(self.getStartupConf())
      }
    })

    con.on('sslconnect', function () {
      con.startup(self.getStartupConf())
    })

    this._attachListeners(con)

    con.once('end', () => {
      const error = this._ending ? new Error('Connection terminated') : new Error('Connection terminated unexpectedly')

      clearTimeout(this.connectionTimeoutHandle)
      this._errorAllQueries(error)
      this._ended = true

      if (!this._ending) {
        // if the connection is ended without us calling .end()
        // on this client then we have an unexpected disconnection
        // treat this as an error unless we've already emitted an error
        // during connection.
        if (this._connecting && !this._connectionError) {
          if (this._connectionCallback) {
            this._connectionCallback(error)
          } else {
            this._handleErrorEvent(error)
          }
        } else if (!this._connectionError) {
          this._handleErrorEvent(error)
        }
      }

      process.nextTick(() => {
        this.emit('end')
      })
    })
  }

  connect(callback) {
    if (callback) {
      this._connect(callback)
      return
    }

    return new this._Promise((resolve, reject) => {
      this._connect((error) => {
        if (error) {
          reject(error)
        } else {
          resolve()
        }
      })
    })
  }
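`connect()` (and `end()` further down) supports both callback and Promise styles with the same underlying code path. The pattern, reduced to a standalone helper (the helper name is made up for illustration):

```javascript
// If a callback is supplied, use it; otherwise return a Promise adapting
// the node-style (err, value) callback.
function callbackOrPromise(work, callback) {
  if (callback) {
    work(callback)
    return
  }
  return new Promise((resolve, reject) => {
    work((err, value) => (err ? reject(err) : resolve(value)))
  })
}

// callback style
callbackOrPromise((done) => done(null, 'ok'), (err, v) => console.log(v))
// promise style
callbackOrPromise((done) => done(null, 'ok')).then((v) => console.log(v))
```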

  _attachListeners(con) {
    // password request handling
    con.on('authenticationCleartextPassword', this._handleAuthCleartextPassword.bind(this))
    // password request handling
    con.on('authenticationMD5Password', this._handleAuthMD5Password.bind(this))
    // password request handling (SASL)
    con.on('authenticationSASL', this._handleAuthSASL.bind(this))
    con.on('authenticationSASLContinue', this._handleAuthSASLContinue.bind(this))
    con.on('authenticationSASLFinal', this._handleAuthSASLFinal.bind(this))
    con.on('backendKeyData', this._handleBackendKeyData.bind(this))
    con.on('error', this._handleErrorEvent.bind(this))
    con.on('errorMessage', this._handleErrorMessage.bind(this))
    con.on('readyForQuery', this._handleReadyForQuery.bind(this))
    con.on('notice', this._handleNotice.bind(this))
    con.on('rowDescription', this._handleRowDescription.bind(this))
    con.on('dataRow', this._handleDataRow.bind(this))
    con.on('portalSuspended', this._handlePortalSuspended.bind(this))
    con.on('emptyQuery', this._handleEmptyQuery.bind(this))
    con.on('commandComplete', this._handleCommandComplete.bind(this))
    con.on('parseComplete', this._handleParseComplete.bind(this))
    con.on('copyInResponse', this._handleCopyInResponse.bind(this))
    con.on('copyData', this._handleCopyData.bind(this))
    con.on('notification', this._handleNotification.bind(this))
  }

  // TODO(bmc): deprecate pgpass "built in" integration since this.password can be a function
  // it can be supplied by the user if required - this is a breaking change!
  _checkPgPass(cb) {
    const con = this.connection
    if (typeof this.password === 'function') {
      this._Promise
        .resolve()
        .then(() => this.password())
        .then((pass) => {
          if (pass !== undefined) {
            if (typeof pass !== 'string') {
              con.emit('error', new TypeError('Password must be a string'))
              return
            }
            this.connectionParameters.password = this.password = pass
          } else {
            this.connectionParameters.password = this.password = null
          }
          cb()
        })
        .catch((err) => {
          con.emit('error', err)
        })
    } else if (this.password !== null) {
      cb()
    } else {
      try {
        const pgPass = require('pgpass')
        pgPass(this.connectionParameters, (pass) => {
          if (undefined !== pass) {
            this.connectionParameters.password = this.password = pass
          }
          cb()
        })
      } catch (e) {
        this.emit('error', e)
      }
    }
  }
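`_checkPgPass` accepts a password that is a plain string, a function returning a string, or a function returning a Promise of a string. That normalization can be sketched on its own (the helper name is hypothetical, not the library's API):

```javascript
// Resolve a password that may be a value, a sync function, or an async function.
// A function returning undefined normalizes to null, as in _checkPgPass.
function resolvePassword(password) {
  if (typeof password !== 'function') {
    return Promise.resolve(password)
  }
  return Promise.resolve()
    .then(() => password())
    .then((pass) => {
      if (pass !== undefined && typeof pass !== 'string') {
        throw new TypeError('Password must be a string')
      }
      return pass === undefined ? null : pass
    })
}
```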

  _handleAuthCleartextPassword(msg) {
    this._checkPgPass(() => {
      this.connection.password(this.password)
    })
  }

  _handleAuthMD5Password(msg) {
    this._checkPgPass(async () => {
      try {
        const hashedPassword = await crypto.postgresMd5PasswordHash(this.user, this.password, msg.salt)
        this.connection.password(hashedPassword)
      } catch (e) {
        this.emit('error', e)
      }
    })
  }

  _handleAuthSASL(msg) {
    this._checkPgPass(() => {
      try {
        this.saslSession = sasl.startSession(msg.mechanisms)
        this.connection.sendSASLInitialResponseMessage(this.saslSession.mechanism, this.saslSession.response)
      } catch (err) {
        this.connection.emit('error', err)
      }
    })
  }

  async _handleAuthSASLContinue(msg) {
    try {
      await sasl.continueSession(this.saslSession, this.password, msg.data)
      this.connection.sendSCRAMClientFinalMessage(this.saslSession.response)
    } catch (err) {
      this.connection.emit('error', err)
    }
  }

  _handleAuthSASLFinal(msg) {
    try {
      sasl.finalizeSession(this.saslSession, msg.data)
      this.saslSession = null
    } catch (err) {
      this.connection.emit('error', err)
    }
  }

  _handleBackendKeyData(msg) {
    this.processID = msg.processID
    this.secretKey = msg.secretKey
  }

  _handleReadyForQuery(msg) {
    if (this._connecting) {
      this._connecting = false
      this._connected = true
      clearTimeout(this.connectionTimeoutHandle)

      // process possible callback argument to Client#connect
      if (this._connectionCallback) {
        this._connectionCallback(null, this)
        // remove callback for proper error handling
        // after the connect event
        this._connectionCallback = null
      }
      this.emit('connect')
    }
    const { activeQuery } = this
    this.activeQuery = null
    this.readyForQuery = true
    if (activeQuery) {
      activeQuery.handleReadyForQuery(this.connection)
    }
    this._pulseQueryQueue()
  }

  // if we receive an error event or error message
  // during the connection process we handle it here
  _handleErrorWhileConnecting(err) {
    if (this._connectionError) {
      // TODO(bmc): this is swallowing errors - we shouldn't do this
      return
    }
    this._connectionError = true
    clearTimeout(this.connectionTimeoutHandle)
    if (this._connectionCallback) {
      return this._connectionCallback(err)
    }
    this.emit('error', err)
  }

  // if we're connected and we receive an error event from the connection
  // this means the socket is dead - do a hard abort of all queries and emit
  // the socket error on the client as well
  _handleErrorEvent(err) {
    if (this._connecting) {
      return this._handleErrorWhileConnecting(err)
    }
    this._queryable = false
    this._errorAllQueries(err)
    this.emit('error', err)
  }

  // handle error messages from the postgres backend
  _handleErrorMessage(msg) {
    if (this._connecting) {
      return this._handleErrorWhileConnecting(msg)
    }
    const activeQuery = this.activeQuery

    if (!activeQuery) {
      this._handleErrorEvent(msg)
      return
    }

    this.activeQuery = null
    activeQuery.handleError(msg, this.connection)
  }

  _handleRowDescription(msg) {
    // delegate rowDescription to active query
    this.activeQuery.handleRowDescription(msg)
  }

  _handleDataRow(msg) {
    // delegate dataRow to active query
    this.activeQuery.handleDataRow(msg)
  }

  _handlePortalSuspended(msg) {
    // delegate portalSuspended to active query
    this.activeQuery.handlePortalSuspended(this.connection)
  }

  _handleEmptyQuery(msg) {
    // delegate emptyQuery to active query
    this.activeQuery.handleEmptyQuery(this.connection)
  }

  _handleCommandComplete(msg) {
    if (this.activeQuery == null) {
      const error = new Error('Received unexpected commandComplete message from backend.')
      this._handleErrorEvent(error)
      return
    }
    // delegate commandComplete to active query
    this.activeQuery.handleCommandComplete(msg, this.connection)
  }

  _handleParseComplete() {
    if (this.activeQuery == null) {
      const error = new Error('Received unexpected parseComplete message from backend.')
      this._handleErrorEvent(error)
      return
    }
    // if a prepared statement has a name and properly parses
    // we track that it's already been executed so we don't parse
    // it again on the same client
    if (this.activeQuery.name) {
      this.connection.parsedStatements[this.activeQuery.name] = this.activeQuery.text
    }
  }

  _handleCopyInResponse(msg) {
    this.activeQuery.handleCopyInResponse(this.connection)
  }

  _handleCopyData(msg) {
    this.activeQuery.handleCopyData(msg, this.connection)
  }

  _handleNotification(msg) {
    this.emit('notification', msg)
  }

  _handleNotice(msg) {
    this.emit('notice', msg)
  }

  getStartupConf() {
    var params = this.connectionParameters

    var data = {
      user: params.user,
      database: params.database,
    }

    var appName = params.application_name || params.fallback_application_name
    if (appName) {
      data.application_name = appName
    }
    if (params.replication) {
      data.replication = '' + params.replication
    }
    if (params.statement_timeout) {
      data.statement_timeout = String(parseInt(params.statement_timeout, 10))
    }
    if (params.lock_timeout) {
      data.lock_timeout = String(parseInt(params.lock_timeout, 10))
    }
    if (params.idle_in_transaction_session_timeout) {
      data.idle_in_transaction_session_timeout = String(parseInt(params.idle_in_transaction_session_timeout, 10))
    }
    if (params.options) {
      data.options = params.options
    }

    return data
  }
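Every startup parameter crosses the wire as text, which is why `getStartupConf` coerces the timeout settings with `String(parseInt(...))` and includes optional fields only when set. A trimmed sketch of that shaping, covering just a few of the fields above (the function name is illustrative):

```javascript
// Build a startup map: include optional fields only when present, and
// normalize numeric timeouts to decimal strings.
function buildStartupData(params) {
  const data = { user: params.user, database: params.database }
  const appName = params.application_name || params.fallback_application_name
  if (appName) {
    data.application_name = appName
  }
  if (params.statement_timeout) {
    data.statement_timeout = String(parseInt(params.statement_timeout, 10))
  }
  return data
}

console.log(buildStartupData({ user: 'u', database: 'db', statement_timeout: '5000' }))
```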

  cancel(client, query) {
    if (client.activeQuery === query) {
      var con = this.connection

      if (this.host && this.host.indexOf('/') === 0) {
        con.connect(this.host + '/.s.PGSQL.' + this.port)
      } else {
        con.connect(this.port, this.host)
      }

      // once connection is established send cancel message
      con.on('connect', function () {
        con.cancel(client.processID, client.secretKey)
      })
    } else if (client.queryQueue.indexOf(query) !== -1) {
      client.queryQueue.splice(client.queryQueue.indexOf(query), 1)
    }
  }

  setTypeParser(oid, format, parseFn) {
    return this._types.setTypeParser(oid, format, parseFn)
  }

  getTypeParser(oid, format) {
    return this._types.getTypeParser(oid, format)
  }

  // escapeIdentifier and escapeLiteral moved to utility functions & exported
  // on PG
  // re-exported here for backwards compatibility
  escapeIdentifier(str) {
    return utils.escapeIdentifier(str)
  }

  escapeLiteral(str) {
    return utils.escapeLiteral(str)
  }

  _pulseQueryQueue() {
    if (this.readyForQuery === true) {
      this.activeQuery = this.queryQueue.shift()
      if (this.activeQuery) {
        this.readyForQuery = false
        this.hasExecuted = true

        const queryError = this.activeQuery.submit(this.connection)
        if (queryError) {
          process.nextTick(() => {
            this.activeQuery.handleError(queryError, this.connection)
            this.readyForQuery = true
            this._pulseQueryQueue()
          })
        }
      } else if (this.hasExecuted) {
        this.activeQuery = null
        this.emit('drain')
      }
    }
  }
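`_pulseQueryQueue` enforces one in-flight query at a time: the next queued query is submitted only once the backend has reported `readyForQuery`. The discipline in miniature, with synchronous completion standing in for the server round-trip (class and field names are illustrative):

```javascript
// FIFO queue where a query is dispatched only while readyForQuery is true.
class MiniQueue {
  constructor() {
    this.queue = []
    this.readyForQuery = true
    this.executed = []
  }
  query(name) {
    this.queue.push(name)
    this.pulse()
  }
  pulse() {
    if (!this.readyForQuery) return
    const next = this.queue.shift()
    if (!next) return
    this.readyForQuery = false
    this.executed.push(next)  // "submit" the query
    this.readyForQuery = true // server reports readyForQuery (instant here)
    this.pulse()              // dispatch whatever queued up meanwhile
  }
}

const q = new MiniQueue()
q.query('select 1')
q.query('select 2')
console.log(q.executed) // queries ran one at a time, in FIFO order
```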

  query(config, values, callback) {
    // can take in strings, config object or query object
    var query
    var result
    var readTimeout
    var readTimeoutTimer
    var queryCallback

    if (config === null || config === undefined) {
      throw new TypeError('Client was passed a null or undefined query')
    } else if (typeof config.submit === 'function') {
      readTimeout = config.query_timeout || this.connectionParameters.query_timeout
      result = query = config
      if (typeof values === 'function') {
        query.callback = query.callback || values
      }
    } else {
      readTimeout = config.query_timeout || this.connectionParameters.query_timeout
      query = new Query(config, values, callback)
      if (!query.callback) {
        result = new this._Promise((resolve, reject) => {
          query.callback = (err, res) => (err ? reject(err) : resolve(res))
        }).catch((err) => {
          // replace the stack trace that leads to `TCP.onStreamRead` with one that leads back to the
          // application that created the query
          Error.captureStackTrace(err)
          throw err
        })
      }
    }

    if (readTimeout) {
      queryCallback = query.callback

      readTimeoutTimer = setTimeout(() => {
        var error = new Error('Query read timeout')

        process.nextTick(() => {
          query.handleError(error, this.connection)
        })

        queryCallback(error)

        // we already returned an error,
        // just do nothing if query completes
        query.callback = () => {}

        // Remove from queue
        var index = this.queryQueue.indexOf(query)
        if (index > -1) {
          this.queryQueue.splice(index, 1)
        }

        this._pulseQueryQueue()
      }, readTimeout)

      query.callback = (err, res) => {
        clearTimeout(readTimeoutTimer)
        queryCallback(err, res)
      }
    }

    if (this.binary && !query.binary) {
      query.binary = true
    }

    if (query._result && !query._result._types) {
      query._result._types = this._types
    }

    if (!this._queryable) {
      process.nextTick(() => {
        query.handleError(new Error('Client has encountered a connection error and is not queryable'), this.connection)
      })
      return result
    }

    if (this._ending) {
      process.nextTick(() => {
        query.handleError(new Error('Client was closed and is not queryable'), this.connection)
      })
      return result
    }

    this.queryQueue.push(query)
    this._pulseQueryQueue()
    return result
  }
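The `query_timeout` handling above does two things: it clears the timer when the query completes normally, and it swaps `query.callback` for a no-op once the timer fires so a late completion is silently dropped. That swap, isolated into a hypothetical helper:

```javascript
// Wrap a node-style callback with a read timeout. If the timer fires first,
// onTimeout gets the error and any later completion is ignored.
function guardWithTimeout(callback, ms, onTimeout) {
  let done = callback
  const timer = setTimeout(() => {
    onTimeout(new Error('Query read timeout'))
    done = () => {} // late completions are swallowed
  }, ms)
  return (err, res) => {
    clearTimeout(timer)
    done(err, res)
  }
}

let result = null
const cb = guardWithTimeout((err, res) => { result = res }, 1000, () => {})
cb(null, 42) // completes before the timeout; timer is cleared
console.log(result)
```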

  ref() {
    this.connection.ref()
  }

  unref() {
    this.connection.unref()
  }

  end(cb) {
    this._ending = true

    // if we have never connected, then end is a noop, callback immediately
    if (!this.connection._connecting || this._ended) {
      if (cb) {
        cb()
      } else {
        return this._Promise.resolve()
      }
    }

    if (this.activeQuery || !this._queryable) {
      // if we have an active query we need to force a disconnect
      // on the socket - otherwise a hung query could block end forever
      this.connection.stream.destroy()
    } else {
      this.connection.end()
    }

    if (cb) {
      this.connection.once('end', cb)
    } else {
      return new this._Promise((resolve) => {
        this.connection.once('end', resolve)
      })
    }
  }
}

// expose a Query constructor
Client.Query = Query

module.exports = Client
167
node_modules/pg/lib/connection-parameters.js
generated
vendored
Normal file
@@ -0,0 +1,167 @@
'use strict'

var dns = require('dns')

var defaults = require('./defaults')

var parse = require('pg-connection-string').parse // parses a connection string

var val = function (key, config, envVar) {
  if (envVar === undefined) {
    envVar = process.env['PG' + key.toUpperCase()]
  } else if (envVar === false) {
    // do nothing ... use false
  } else {
    envVar = process.env[envVar]
  }

  return config[key] || envVar || defaults[key]
}
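`val` resolves each setting with the precedence explicit config → `PG*` environment variable → library default. The same lookup with the environment passed in explicitly, so it can be exercised without touching `process.env` (the helper name and defaults are illustrative):

```javascript
// config wins over the PG* environment variable, which wins over defaults.
function lookup(key, config, env, defaults) {
  const envVar = env['PG' + key.toUpperCase()]
  return config[key] || envVar || defaults[key]
}

const env = { PGPORT: '6432' }
const defaults = { port: 5432, host: 'localhost' }
console.log(lookup('port', {}, env, defaults))             // env beats default
console.log(lookup('port', { port: 9999 }, env, defaults)) // config beats env
console.log(lookup('host', {}, env, defaults))             // falls back to default
```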

var readSSLConfigFromEnvironment = function () {
  switch (process.env.PGSSLMODE) {
    case 'disable':
      return false
    case 'prefer':
    case 'require':
    case 'verify-ca':
    case 'verify-full':
      return true
    case 'no-verify':
      return { rejectUnauthorized: false }
  }
  return defaults.ssl
}

// Convert arg to a string, surround in single quotes, and escape single quotes and backslashes
var quoteParamValue = function (value) {
  return "'" + ('' + value).replace(/\\/g, '\\\\').replace(/'/g, "\\'") + "'"
}

var add = function (params, config, paramName) {
  var value = config[paramName]
  if (value !== undefined && value !== null) {
    params.push(paramName + '=' + quoteParamValue(value))
  }
}
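`quoteParamValue` produces libpq-style quoting: backslashes are doubled first, then single quotes are backslash-escaped, and the result is wrapped in single quotes. The ordering matters, since escaping quotes first would let the backslash pass double the escape. Exercising the function directly:

```javascript
// Same implementation as above, reproduced so this block is self-contained.
var quoteParamValue = function (value) {
  return "'" + ('' + value).replace(/\\/g, '\\\\').replace(/'/g, "\\'") + "'"
}

console.log(quoteParamValue('plain')) // wrapped in single quotes
console.log(quoteParamValue("it's"))  // embedded quote escaped
console.log(quoteParamValue(5432))    // non-strings are stringified first
```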

class ConnectionParameters {
  constructor(config) {
    // if a string is passed, it is a raw connection string so we parse it into a config
    config = typeof config === 'string' ? parse(config) : config || {}

    // if the config has a connectionString defined, parse IT into the config we use
    // this will override other default values with what is stored in connectionString
    if (config.connectionString) {
      config = Object.assign({}, config, parse(config.connectionString))
    }

    this.user = val('user', config)
    this.database = val('database', config)

    if (this.database === undefined) {
      this.database = this.user
    }

    this.port = parseInt(val('port', config), 10)
    this.host = val('host', config)

    // "hiding" the password so it doesn't show up in stack traces
    // or if the client is console.logged
    Object.defineProperty(this, 'password', {
      configurable: true,
      enumerable: false,
      writable: true,
      value: val('password', config),
    })

    this.binary = val('binary', config)
    this.options = val('options', config)

    this.ssl = typeof config.ssl === 'undefined' ? readSSLConfigFromEnvironment() : config.ssl

    if (typeof this.ssl === 'string') {
      if (this.ssl === 'true') {
        this.ssl = true
      }
    }
    // support passing in ssl=no-verify via connection string
    if (this.ssl === 'no-verify') {
      this.ssl = { rejectUnauthorized: false }
    }
    if (this.ssl && this.ssl.key) {
      Object.defineProperty(this.ssl, 'key', {
        enumerable: false,
      })
    }

    this.client_encoding = val('client_encoding', config)
    this.replication = val('replication', config)
    // a domain socket begins with '/'
    this.isDomainSocket = !(this.host || '').indexOf('/')

    this.application_name = val('application_name', config, 'PGAPPNAME')
    this.fallback_application_name = val('fallback_application_name', config, false)
    this.statement_timeout = val('statement_timeout', config, false)
    this.lock_timeout = val('lock_timeout', config, false)
    this.idle_in_transaction_session_timeout = val('idle_in_transaction_session_timeout', config, false)
    this.query_timeout = val('query_timeout', config, false)

    if (config.connectionTimeoutMillis === undefined) {
      this.connect_timeout = process.env.PGCONNECT_TIMEOUT || 0
    } else {
      this.connect_timeout = Math.floor(config.connectionTimeoutMillis / 1000)
    }

    if (config.keepAlive === false) {
      this.keepalives = 0
    } else if (config.keepAlive === true) {
      this.keepalives = 1
    }

    if (typeof config.keepAliveInitialDelayMillis === 'number') {
      this.keepalives_idle = Math.floor(config.keepAliveInitialDelayMillis / 1000)
    }
  }
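The terse `isDomainSocket` line leans on `indexOf`: a path like `/var/run/postgresql` has `indexOf('/') === 0`, and `!0` is `true`, while any other host yields a nonzero index (or `-1` when absent) and therefore `false`. Spelled out standalone:

```javascript
// !(host || '').indexOf('/') is true only when host starts with '/'
function isDomainSocket(host) {
  return !(host || '').indexOf('/')
}

console.log(isDomainSocket('/var/run/postgresql')) // true  (index 0, !0 === true)
console.log(isDomainSocket('localhost'))           // false (index -1, !-1 === false)
console.log(isDomainSocket(undefined))             // false (''.indexOf('/') === -1)
```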
|
||||
|
||||
getLibpqConnectionString(cb) {
|
||||
var params = []
|
||||
add(params, this, 'user')
|
||||
add(params, this, 'password')
|
||||
add(params, this, 'port')
|
||||
add(params, this, 'application_name')
|
||||
add(params, this, 'fallback_application_name')
|
||||
add(params, this, 'connect_timeout')
|
||||
add(params, this, 'options')
|
||||
|
||||
var ssl = typeof this.ssl === 'object' ? this.ssl : this.ssl ? { sslmode: this.ssl } : {}
|
||||
add(params, ssl, 'sslmode')
|
||||
add(params, ssl, 'sslca')
|
||||
add(params, ssl, 'sslkey')
|
||||
add(params, ssl, 'sslcert')
|
||||
add(params, ssl, 'sslrootcert')
|
||||
|
||||
if (this.database) {
|
||||
params.push('dbname=' + quoteParamValue(this.database))
|
||||
}
|
||||
if (this.replication) {
|
||||
params.push('replication=' + quoteParamValue(this.replication))
|
||||
}
|
||||
if (this.host) {
|
||||
params.push('host=' + quoteParamValue(this.host))
|
||||
}
|
||||
if (this.isDomainSocket) {
|
||||
return cb(null, params.join(' '))
|
||||
}
|
||||
if (this.client_encoding) {
|
||||
params.push('client_encoding=' + quoteParamValue(this.client_encoding))
|
||||
}
|
||||
dns.lookup(this.host, function (err, address) {
|
||||
if (err) return cb(err, null)
|
||||
params.push('hostaddr=' + quoteParamValue(address))
|
||||
return cb(null, params.join(' '))
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = ConnectionParameters
|
||||
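The timeout handling above converts millisecond config values to whole seconds, because libpq's `connect_timeout` and `keepalives_idle` are specified in seconds. A minimal standalone sketch of that conversion (the helper name `toLibpqSeconds` is illustrative, not part of pg):

```javascript
// Hypothetical helper mirroring the Math.floor(ms / 1000) conversion above.
// libpq expects connect_timeout and keepalives_idle in whole seconds,
// so sub-second millisecond values round down to 0.
function toLibpqSeconds(millis) {
  return Math.floor(millis / 1000)
}

const connectTimeout = toLibpqSeconds(2500) // 2500 ms floors to 2 s
const keepalivesIdle = toLibpqSeconds(999) // under one second becomes 0
console.log(connectTimeout, keepalivesIdle)
```

Note that a `connectionTimeoutMillis` below 1000 therefore becomes `connect_timeout=0`, which libpq treats as "wait indefinitely".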
222
node_modules/pg/lib/connection.js
generated
vendored
Normal file
@ -0,0 +1,222 @@
'use strict'

var EventEmitter = require('events').EventEmitter

const { parse, serialize } = require('pg-protocol')
const { getStream, getSecureStream } = require('./stream')

const flushBuffer = serialize.flush()
const syncBuffer = serialize.sync()
const endBuffer = serialize.end()

// TODO(bmc) support binary mode at some point
class Connection extends EventEmitter {
  constructor(config) {
    super()
    config = config || {}

    this.stream = config.stream || getStream(config.ssl)
    if (typeof this.stream === 'function') {
      this.stream = this.stream(config)
    }

    this._keepAlive = config.keepAlive
    this._keepAliveInitialDelayMillis = config.keepAliveInitialDelayMillis
    this.lastBuffer = false
    this.parsedStatements = {}
    this.ssl = config.ssl || false
    this._ending = false
    this._emitMessage = false
    var self = this
    this.on('newListener', function (eventName) {
      if (eventName === 'message') {
        self._emitMessage = true
      }
    })
  }

  connect(port, host) {
    var self = this

    this._connecting = true
    this.stream.setNoDelay(true)
    this.stream.connect(port, host)

    this.stream.once('connect', function () {
      if (self._keepAlive) {
        self.stream.setKeepAlive(true, self._keepAliveInitialDelayMillis)
      }
      self.emit('connect')
    })

    const reportStreamError = function (error) {
      // errors about disconnections should be ignored during disconnect
      if (self._ending && (error.code === 'ECONNRESET' || error.code === 'EPIPE')) {
        return
      }
      self.emit('error', error)
    }
    this.stream.on('error', reportStreamError)

    this.stream.on('close', function () {
      self.emit('end')
    })

    if (!this.ssl) {
      return this.attachListeners(this.stream)
    }

    this.stream.once('data', function (buffer) {
      var responseCode = buffer.toString('utf8')
      switch (responseCode) {
        case 'S': // Server supports SSL connections, continue with a secure connection
          break
        case 'N': // Server does not support SSL connections
          self.stream.end()
          return self.emit('error', new Error('The server does not support SSL connections'))
        default:
          // Any other response byte, including 'E' (ErrorResponse) indicating a server error
          self.stream.end()
          return self.emit('error', new Error('There was an error establishing an SSL connection'))
      }
      const options = {
        socket: self.stream,
      }

      if (self.ssl !== true) {
        Object.assign(options, self.ssl)

        if ('key' in self.ssl) {
          options.key = self.ssl.key
        }
      }

      var net = require('net')
      if (net.isIP && net.isIP(host) === 0) {
        options.servername = host
      }
      try {
        self.stream = getSecureStream(options)
      } catch (err) {
        return self.emit('error', err)
      }
      self.attachListeners(self.stream)
      self.stream.on('error', reportStreamError)

      self.emit('sslconnect')
    })
  }

  attachListeners(stream) {
    parse(stream, (msg) => {
      var eventName = msg.name === 'error' ? 'errorMessage' : msg.name
      if (this._emitMessage) {
        this.emit('message', msg)
      }
      this.emit(eventName, msg)
    })
  }

  requestSsl() {
    this.stream.write(serialize.requestSsl())
  }

  startup(config) {
    this.stream.write(serialize.startup(config))
  }

  cancel(processID, secretKey) {
    this._send(serialize.cancel(processID, secretKey))
  }

  password(password) {
    this._send(serialize.password(password))
  }

  sendSASLInitialResponseMessage(mechanism, initialResponse) {
    this._send(serialize.sendSASLInitialResponseMessage(mechanism, initialResponse))
  }

  sendSCRAMClientFinalMessage(additionalData) {
    this._send(serialize.sendSCRAMClientFinalMessage(additionalData))
  }

  _send(buffer) {
    if (!this.stream.writable) {
      return false
    }
    return this.stream.write(buffer)
  }

  query(text) {
    this._send(serialize.query(text))
  }

  // send parse message
  parse(query) {
    this._send(serialize.parse(query))
  }

  // send bind message
  bind(config) {
    this._send(serialize.bind(config))
  }

  // send execute message
  execute(config) {
    this._send(serialize.execute(config))
  }

  flush() {
    if (this.stream.writable) {
      this.stream.write(flushBuffer)
    }
  }

  sync() {
    this._ending = true
    this._send(syncBuffer)
  }

  ref() {
    this.stream.ref()
  }

  unref() {
    this.stream.unref()
  }

  end() {
    // 0x58 = 'X'
    this._ending = true
    if (!this._connecting || !this.stream.writable) {
      this.stream.end()
      return
    }
    return this.stream.write(endBuffer, () => {
      this.stream.end()
    })
  }

  close(msg) {
    this._send(serialize.close(msg))
  }

  describe(msg) {
    this._send(serialize.describe(msg))
  }

  sendCopyFromChunk(chunk) {
    this._send(serialize.copyData(chunk))
  }

  endCopyFrom() {
    this._send(serialize.copyDone())
  }

  sendCopyFail(msg) {
    this._send(serialize.copyFail(msg))
  }
}

module.exports = Connection
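The `once('data', ...)` handler above implements the client side of the SSLRequest handshake: the server replies with a single byte, `'S'` to proceed with TLS, `'N'` to refuse SSL, and anything else is an error. A standalone sketch of that three-way decision (the function name and return values here are illustrative, not pg's API):

```javascript
// Illustrative classifier for the one-byte SSLRequest reply described above.
// 'S' = server supports SSL, continue with TLS; 'N' = server refuses SSL;
// any other byte (including 'E' for ErrorResponse) is treated as an error.
function classifySslResponse(buffer) {
  switch (buffer.toString('utf8')) {
    case 'S':
      return 'secure'
    case 'N':
      return 'unsupported'
    default:
      return 'error'
  }
}

console.log(classifySslResponse(Buffer.from('S'))) // 'secure'
```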
186
node_modules/pg/lib/crypto/sasl.js
generated
vendored
Normal file
@ -0,0 +1,186 @@
'use strict'
const crypto = require('./utils')

function startSession(mechanisms) {
  if (mechanisms.indexOf('SCRAM-SHA-256') === -1) {
    throw new Error('SASL: Only mechanism SCRAM-SHA-256 is currently supported')
  }

  const clientNonce = crypto.randomBytes(18).toString('base64')

  return {
    mechanism: 'SCRAM-SHA-256',
    clientNonce,
    response: 'n,,n=*,r=' + clientNonce,
    message: 'SASLInitialResponse',
  }
}

async function continueSession(session, password, serverData) {
  if (session.message !== 'SASLInitialResponse') {
    throw new Error('SASL: Last message was not SASLInitialResponse')
  }
  if (typeof password !== 'string') {
    throw new Error('SASL: SCRAM-SERVER-FIRST-MESSAGE: client password must be a string')
  }
  if (password === '') {
    throw new Error('SASL: SCRAM-SERVER-FIRST-MESSAGE: client password must be a non-empty string')
  }
  if (typeof serverData !== 'string') {
    throw new Error('SASL: SCRAM-SERVER-FIRST-MESSAGE: serverData must be a string')
  }

  const sv = parseServerFirstMessage(serverData)

  if (!sv.nonce.startsWith(session.clientNonce)) {
    throw new Error('SASL: SCRAM-SERVER-FIRST-MESSAGE: server nonce does not start with client nonce')
  } else if (sv.nonce.length === session.clientNonce.length) {
    throw new Error('SASL: SCRAM-SERVER-FIRST-MESSAGE: server nonce is too short')
  }

  var clientFirstMessageBare = 'n=*,r=' + session.clientNonce
  var serverFirstMessage = 'r=' + sv.nonce + ',s=' + sv.salt + ',i=' + sv.iteration
  var clientFinalMessageWithoutProof = 'c=biws,r=' + sv.nonce
  var authMessage = clientFirstMessageBare + ',' + serverFirstMessage + ',' + clientFinalMessageWithoutProof

  var saltBytes = Buffer.from(sv.salt, 'base64')
  var saltedPassword = await crypto.deriveKey(password, saltBytes, sv.iteration)
  var clientKey = await crypto.hmacSha256(saltedPassword, 'Client Key')
  var storedKey = await crypto.sha256(clientKey)
  var clientSignature = await crypto.hmacSha256(storedKey, authMessage)
  var clientProof = xorBuffers(Buffer.from(clientKey), Buffer.from(clientSignature)).toString('base64')
  var serverKey = await crypto.hmacSha256(saltedPassword, 'Server Key')
  var serverSignatureBytes = await crypto.hmacSha256(serverKey, authMessage)

  session.message = 'SASLResponse'
  session.serverSignature = Buffer.from(serverSignatureBytes).toString('base64')
  session.response = clientFinalMessageWithoutProof + ',p=' + clientProof
}

function finalizeSession(session, serverData) {
  if (session.message !== 'SASLResponse') {
    throw new Error('SASL: Last message was not SASLResponse')
  }
  if (typeof serverData !== 'string') {
    throw new Error('SASL: SCRAM-SERVER-FINAL-MESSAGE: serverData must be a string')
  }

  const { serverSignature } = parseServerFinalMessage(serverData)

  if (serverSignature !== session.serverSignature) {
    throw new Error('SASL: SCRAM-SERVER-FINAL-MESSAGE: server signature does not match')
  }
}

/**
 * printable       = %x21-2B / %x2D-7E
 *                   ;; Printable ASCII except ",".
 *                   ;; Note that any "printable" is also
 *                   ;; a valid "value".
 */
function isPrintableChars(text) {
  if (typeof text !== 'string') {
    throw new TypeError('SASL: text must be a string')
  }
  return text
    .split('')
    .map((_, i) => text.charCodeAt(i))
    .every((c) => (c >= 0x21 && c <= 0x2b) || (c >= 0x2d && c <= 0x7e))
}

/**
 * base64-char     = ALPHA / DIGIT / "/" / "+"
 *
 * base64-4        = 4base64-char
 *
 * base64-3        = 3base64-char "="
 *
 * base64-2        = 2base64-char "=="
 *
 * base64          = *base64-4 [base64-3 / base64-2]
 */
function isBase64(text) {
  return /^(?:[a-zA-Z0-9+/]{4})*(?:[a-zA-Z0-9+/]{2}==|[a-zA-Z0-9+/]{3}=)?$/.test(text)
}

function parseAttributePairs(text) {
  if (typeof text !== 'string') {
    throw new TypeError('SASL: attribute pairs text must be a string')
  }

  return new Map(
    text.split(',').map((attrValue) => {
      if (!/^.=/.test(attrValue)) {
        throw new Error('SASL: Invalid attribute pair entry')
      }
      const name = attrValue[0]
      const value = attrValue.substring(2)
      return [name, value]
    })
  )
}

function parseServerFirstMessage(data) {
  const attrPairs = parseAttributePairs(data)

  const nonce = attrPairs.get('r')
  if (!nonce) {
    throw new Error('SASL: SCRAM-SERVER-FIRST-MESSAGE: nonce missing')
  } else if (!isPrintableChars(nonce)) {
    throw new Error('SASL: SCRAM-SERVER-FIRST-MESSAGE: nonce must only contain printable characters')
  }
  const salt = attrPairs.get('s')
  if (!salt) {
    throw new Error('SASL: SCRAM-SERVER-FIRST-MESSAGE: salt missing')
  } else if (!isBase64(salt)) {
    throw new Error('SASL: SCRAM-SERVER-FIRST-MESSAGE: salt must be base64')
  }
  const iterationText = attrPairs.get('i')
  if (!iterationText) {
    throw new Error('SASL: SCRAM-SERVER-FIRST-MESSAGE: iteration missing')
  } else if (!/^[1-9][0-9]*$/.test(iterationText)) {
    throw new Error('SASL: SCRAM-SERVER-FIRST-MESSAGE: invalid iteration count')
  }
  const iteration = parseInt(iterationText, 10)

  return {
    nonce,
    salt,
    iteration,
  }
}

function parseServerFinalMessage(serverData) {
  const attrPairs = parseAttributePairs(serverData)
  const serverSignature = attrPairs.get('v')
  if (!serverSignature) {
    throw new Error('SASL: SCRAM-SERVER-FINAL-MESSAGE: server signature is missing')
  } else if (!isBase64(serverSignature)) {
    throw new Error('SASL: SCRAM-SERVER-FINAL-MESSAGE: server signature must be base64')
  }
  return {
    serverSignature,
  }
}

function xorBuffers(a, b) {
  if (!Buffer.isBuffer(a)) {
    throw new TypeError('first argument must be a Buffer')
  }
  if (!Buffer.isBuffer(b)) {
    throw new TypeError('second argument must be a Buffer')
  }
  if (a.length !== b.length) {
    throw new Error('Buffer lengths must match')
  }
  if (a.length === 0) {
    throw new Error('Buffers cannot be empty')
  }
  return Buffer.from(a.map((_, i) => a[i] ^ b[i]))
}

module.exports = {
  startSession,
  continueSession,
  finalizeSession,
}
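`parseAttributePairs` above splits a SCRAM message such as `r=...,s=...,i=4096` into a Map keyed by the one-letter attribute name. A self-contained sketch of the same parsing, reimplemented here purely for illustration (not imported from pg):

```javascript
// Illustrative re-implementation of SCRAM attribute-pair parsing:
// each comma-separated entry has the shape "<letter>=<value>".
function parsePairs(text) {
  return new Map(
    text.split(',').map((entry) => {
      if (!/^.=/.test(entry)) {
        throw new Error('Invalid attribute pair entry')
      }
      // first char is the attribute name, everything after '=' is the value
      return [entry[0], entry.substring(2)]
    })
  )
}

const pairs = parsePairs('r=abc123,s=c2FsdA==,i=4096')
console.log(pairs.get('r'), pairs.get('s'), pairs.get('i')) // 'abc123' 'c2FsdA==' '4096'
```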
37
node_modules/pg/lib/crypto/utils-legacy.js
generated
vendored
Normal file
@ -0,0 +1,37 @@
'use strict'
// This file contains crypto utility functions for versions of Node.js < 15.0.0,
// which do not support the WebCrypto.subtle API.

const nodeCrypto = require('crypto')

function md5(string) {
  return nodeCrypto.createHash('md5').update(string, 'utf-8').digest('hex')
}

// See AuthenticationMD5Password at https://www.postgresql.org/docs/current/static/protocol-flow.html
function postgresMd5PasswordHash(user, password, salt) {
  var inner = md5(password + user)
  var outer = md5(Buffer.concat([Buffer.from(inner), salt]))
  return 'md5' + outer
}

function sha256(text) {
  return nodeCrypto.createHash('sha256').update(text).digest()
}

function hmacSha256(key, msg) {
  return nodeCrypto.createHmac('sha256', key).update(msg).digest()
}

async function deriveKey(password, salt, iterations) {
  return nodeCrypto.pbkdf2Sync(password, salt, iterations, 32, 'sha256')
}

module.exports = {
  postgresMd5PasswordHash,
  randomBytes: nodeCrypto.randomBytes,
  deriveKey,
  sha256,
  hmacSha256,
  md5,
}
83
node_modules/pg/lib/crypto/utils-webcrypto.js
generated
vendored
Normal file
@ -0,0 +1,83 @@
const nodeCrypto = require('crypto')

module.exports = {
  postgresMd5PasswordHash,
  randomBytes,
  deriveKey,
  sha256,
  hmacSha256,
  md5,
}

/**
 * The Web Crypto API - grabbed from the Node.js library or the global
 * @type Crypto
 */
const webCrypto = nodeCrypto.webcrypto || globalThis.crypto
/**
 * The SubtleCrypto API for low level crypto operations.
 * @type SubtleCrypto
 */
const subtleCrypto = webCrypto.subtle
const textEncoder = new TextEncoder()

/**
 * Generate a buffer of cryptographically random bytes
 * @param {number} length - number of random bytes to generate
 * @returns {Buffer}
 */
function randomBytes(length) {
  return webCrypto.getRandomValues(Buffer.alloc(length))
}

async function md5(string) {
  try {
    return nodeCrypto.createHash('md5').update(string, 'utf-8').digest('hex')
  } catch (e) {
    // `createHash()` failed so we are probably not in Node.js, use the WebCrypto API instead.
    // Note that the MD5 algorithm on WebCrypto is not available in Node.js.
    // This is why we cannot just use WebCrypto in all environments.
    const data = typeof string === 'string' ? textEncoder.encode(string) : string
    const hash = await subtleCrypto.digest('MD5', data)
    return Array.from(new Uint8Array(hash))
      .map((b) => b.toString(16).padStart(2, '0'))
      .join('')
  }
}

// See AuthenticationMD5Password at https://www.postgresql.org/docs/current/static/protocol-flow.html
async function postgresMd5PasswordHash(user, password, salt) {
  var inner = await md5(password + user)
  var outer = await md5(Buffer.concat([Buffer.from(inner), salt]))
  return 'md5' + outer
}

/**
 * Create a SHA-256 digest of the given data
 * @param {Buffer} text
 */
async function sha256(text) {
  return await subtleCrypto.digest('SHA-256', text)
}

/**
 * Sign the message with the given key
 * @param {ArrayBuffer} keyBuffer
 * @param {string} msg
 */
async function hmacSha256(keyBuffer, msg) {
  const key = await subtleCrypto.importKey('raw', keyBuffer, { name: 'HMAC', hash: 'SHA-256' }, false, ['sign'])
  return await subtleCrypto.sign('HMAC', key, textEncoder.encode(msg))
}

/**
 * Derive a key from the password and salt
 * @param {string} password
 * @param {Uint8Array} salt
 * @param {number} iterations
 */
async function deriveKey(password, salt, iterations) {
  const key = await subtleCrypto.importKey('raw', textEncoder.encode(password), 'PBKDF2', false, ['deriveBits'])
  const params = { name: 'PBKDF2', hash: 'SHA-256', salt: salt, iterations: iterations }
  // deriveBits(algorithm, baseKey, length) takes three arguments; 32 * 8 = 256 bits
  return await subtleCrypto.deriveBits(params, key, 32 * 8)
}
9
node_modules/pg/lib/crypto/utils.js
generated
vendored
Normal file
@ -0,0 +1,9 @@
'use strict'

const useLegacyCrypto = parseInt(process.versions && process.versions.node && process.versions.node.split('.')[0]) < 15
if (useLegacyCrypto) {
  // We are on an old version of Node.js that requires legacy crypto utilities.
  module.exports = require('./utils-legacy')
} else {
  module.exports = require('./utils-webcrypto')
}
84
node_modules/pg/lib/defaults.js
generated
vendored
Normal file
@ -0,0 +1,84 @@
'use strict'

module.exports = {
  // database host. defaults to localhost
  host: 'localhost',

  // database user's name
  user: process.platform === 'win32' ? process.env.USERNAME : process.env.USER,

  // name of database to connect
  database: undefined,

  // database user's password
  password: null,

  // a Postgres connection string to be used instead of setting individual connection items
  // NOTE: Setting this value will cause it to override any other value (such as database or user) defined
  // in the defaults object.
  connectionString: undefined,

  // database port
  port: 5432,

  // number of rows to return at a time from a prepared statement's
  // portal. 0 will return all rows at once
  rows: 0,

  // binary result mode
  binary: false,

  // Connection pool options - see https://github.com/brianc/node-pg-pool

  // number of connections to use in connection pool
  // 0 will disable connection pooling
  max: 10,

  // max milliseconds a client can go unused before it is removed
  // from the pool and destroyed
  idleTimeoutMillis: 30000,

  client_encoding: '',

  ssl: false,

  application_name: undefined,

  fallback_application_name: undefined,

  options: undefined,

  parseInputDatesAsUTC: false,

  // max milliseconds any query using this connection will execute for before timing out in error.
  // false=unlimited
  statement_timeout: false,

  // Abort any statement that waits longer than the specified duration in milliseconds while attempting to acquire a lock.
  // false=unlimited
  lock_timeout: false,

  // Terminate any session with an open transaction that has been idle for longer than the specified duration in milliseconds
  // false=unlimited
  idle_in_transaction_session_timeout: false,

  // max milliseconds to wait for query to complete (client side)
  query_timeout: false,

  connect_timeout: 0,

  keepalives: 1,

  keepalives_idle: 0,
}

var pgTypes = require('pg-types')
// save default parsers
var parseBigInteger = pgTypes.getTypeParser(20, 'text')
var parseBigIntegerArray = pgTypes.getTypeParser(1016, 'text')

// parse int8 so you can get your count values as actual numbers
module.exports.__defineSetter__('parseInt8', function (val) {
  pgTypes.setTypeParser(20, 'text', val ? pgTypes.getTypeParser(23, 'text') : parseBigInteger)
  pgTypes.setTypeParser(1016, 'text', val ? pgTypes.getTypeParser(1007, 'text') : parseBigIntegerArray)
})
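The `parseInt8` setter above uses the legacy `__defineSetter__` API so that a plain assignment (`defaults.parseInt8 = true`) swaps the registered int8 type parsers as a side effect. A minimal standalone sketch of that setter-with-side-effect pattern (the parser registry here is a stand-in for illustration, not pg-types):

```javascript
// Stand-in parser registry, playing the role pg-types plays above.
const registry = { int8: String }
const defaultInt8Parser = registry.int8

const settings = {}
// Assigning settings.parseInt8 swaps the registered parser as a side effect,
// mirroring how defaults.parseInt8 flips the pg-types parser for OID 20.
settings.__defineSetter__('parseInt8', function (val) {
  registry.int8 = val ? Number : defaultInt8Parser
})

settings.parseInt8 = true
console.log(registry.int8('42')) // parses to the number 42
```

`Object.defineProperty` with a `set` function is the modern equivalent; pg keeps `__defineSetter__` here for backwards compatibility.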
58
node_modules/pg/lib/index.js
generated
vendored
Normal file
@ -0,0 +1,58 @@
'use strict'

var Client = require('./client')
var defaults = require('./defaults')
var Connection = require('./connection')
var Pool = require('pg-pool')
const { DatabaseError } = require('pg-protocol')
const { escapeIdentifier, escapeLiteral } = require('./utils')

const poolFactory = (Client) => {
  return class BoundPool extends Pool {
    constructor(options) {
      super(options, Client)
    }
  }
}

var PG = function (clientConstructor) {
  this.defaults = defaults
  this.Client = clientConstructor
  this.Query = this.Client.Query
  this.Pool = poolFactory(this.Client)
  this._pools = []
  this.Connection = Connection
  this.types = require('pg-types')
  this.DatabaseError = DatabaseError
  this.escapeIdentifier = escapeIdentifier
  this.escapeLiteral = escapeLiteral
}

if (typeof process.env.NODE_PG_FORCE_NATIVE !== 'undefined') {
  module.exports = new PG(require('./native'))
} else {
  module.exports = new PG(Client)

  // lazy require native module...the native module may not have installed
  Object.defineProperty(module.exports, 'native', {
    configurable: true,
    enumerable: false,
    get() {
      var native = null
      try {
        native = new PG(require('./native'))
      } catch (err) {
        if (err.code !== 'MODULE_NOT_FOUND') {
          throw err
        }
      }

      // overwrite module.exports.native so that getter is never called again
      Object.defineProperty(module.exports, 'native', {
        value: native,
      })

      return native
    },
  })
}
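The `native` property above lazily requires `pg-native` on first access, then redefines the property as a plain value so the `require` cost (and the MODULE_NOT_FOUND probe) is paid at most once. The same caching-getter pattern in isolation (`loadCount` and the stand-in value are illustrative, not part of pg):

```javascript
// Caching lazy getter: the expensive factory runs on first access only,
// then the property is redefined as a plain value so later reads skip the getter.
let loadCount = 0
const container = {}
Object.defineProperty(container, 'native', {
  configurable: true, // must be configurable so we can redefine it below
  get() {
    loadCount += 1
    const value = { loaded: true } // stand-in for require('./native')
    Object.defineProperty(container, 'native', { value })
    return value
  },
})

container.native // first access: getter runs
container.native // second access: plain value, getter skipped
console.log(loadCount) // 1
```

`configurable: true` on the initial definition is what makes the self-replacement legal; without it, the second `defineProperty` would throw.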
307
node_modules/pg/lib/native/client.js
generated
vendored
Normal file
@ -0,0 +1,307 @@
'use strict'
|
||||
|
||||
// eslint-disable-next-line
|
||||
var Native
|
||||
try {
|
||||
// Wrap this `require()` in a try-catch to avoid upstream bundlers from complaining that this might not be available since it is an optional import
|
||||
Native = require('pg-native')
|
||||
} catch (e) {
|
||||
throw e
|
||||
}
|
||||
var TypeOverrides = require('../type-overrides')
|
||||
var EventEmitter = require('events').EventEmitter
|
||||
var util = require('util')
|
||||
var ConnectionParameters = require('../connection-parameters')
|
||||
|
||||
var NativeQuery = require('./query')
|
||||
|
||||
var Client = (module.exports = function (config) {
|
||||
EventEmitter.call(this)
|
||||
config = config || {}
|
||||
|
||||
this._Promise = config.Promise || global.Promise
|
||||
this._types = new TypeOverrides(config.types)
|
||||
|
||||
this.native = new Native({
|
||||
types: this._types,
|
||||
})
|
||||
|
||||
this._queryQueue = []
|
||||
this._ending = false
|
||||
this._connecting = false
|
||||
this._connected = false
|
||||
this._queryable = true
|
||||
|
||||
// keep these on the object for legacy reasons
|
||||
// for the time being. TODO: deprecate all this jazz
|
||||
var cp = (this.connectionParameters = new ConnectionParameters(config))
|
||||
if (config.nativeConnectionString) cp.nativeConnectionString = config.nativeConnectionString
|
||||
this.user = cp.user
|
||||
|
||||
// "hiding" the password so it doesn't show up in stack traces
|
||||
// or if the client is console.logged
|
||||
Object.defineProperty(this, 'password', {
|
||||
configurable: true,
|
||||
enumerable: false,
|
||||
writable: true,
|
||||
value: cp.password,
|
||||
})
|
||||
this.database = cp.database
|
||||
this.host = cp.host
|
||||
this.port = cp.port
|
||||
|
||||
// a hash to hold named queries
|
||||
this.namedQueries = {}
|
||||
})
|
||||
|
||||
Client.Query = NativeQuery
|
||||
|
||||
util.inherits(Client, EventEmitter)
|
||||
|
||||
Client.prototype._errorAllQueries = function (err) {
|
||||
const enqueueError = (query) => {
|
||||
process.nextTick(() => {
|
||||
query.native = this.native
|
||||
query.handleError(err)
|
||||
})
|
||||
}
|
||||
|
||||
if (this._hasActiveQuery()) {
|
||||
enqueueError(this._activeQuery)
|
||||
this._activeQuery = null
|
||||
}
|
||||
|
||||
this._queryQueue.forEach(enqueueError)
|
||||
this._queryQueue.length = 0
|
||||
}
|
||||
|
||||
// connect to the backend
|
||||
// pass an optional callback to be called once connected
|
||||
// or with an error if there was a connection error
|
||||
Client.prototype._connect = function (cb) {
|
||||
var self = this
|
||||
|
||||
if (this._connecting) {
|
||||
process.nextTick(() => cb(new Error('Client has already been connected. You cannot reuse a client.')))
|
||||
return
|
||||
}
|
||||
|
||||
this._connecting = true
|
||||
|
||||
this.connectionParameters.getLibpqConnectionString(function (err, conString) {
|
||||
if (self.connectionParameters.nativeConnectionString) conString = self.connectionParameters.nativeConnectionString
|
||||
if (err) return cb(err)
|
||||
self.native.connect(conString, function (err) {
|
||||
if (err) {
|
||||
self.native.end()
|
||||
return cb(err)
|
||||
}
|
||||
|
||||
// set internal states to connected
|
||||
self._connected = true
|
||||
|
||||
// handle connection errors from the native layer
|
||||
self.native.on('error', function (err) {
|
||||
self._queryable = false
|
||||
self._errorAllQueries(err)
|
||||
self.emit('error', err)
|
||||
})
|
||||
|
||||
self.native.on('notification', function (msg) {
|
||||
self.emit('notification', {
|
||||
channel: msg.relname,
|
||||
payload: msg.extra,
|
||||
})
|
||||
})
|
||||
|
||||
// signal we are connected now
|
||||
self.emit('connect')
|
||||
self._pulseQueryQueue(true)
|
||||
|
||||
cb()
|
||||
})
|
||||
})
|
||||
}
|
||||
|
||||
Client.prototype.connect = function (callback) {
|
||||
if (callback) {
|
||||
this._connect(callback)
|
||||
return
|
||||
}
|
||||
|
||||
return new this._Promise((resolve, reject) => {
|
||||
this._connect((error) => {
|
||||
if (error) {
|
||||
reject(error)
|
||||
} else {
|
||||
resolve()
|
||||
}
|
||||
})
|
||||
})
|
||||
}
|
||||
|
||||
// send a query to the server
|
||||
// this method is highly overloaded to take
|
||||
// 1) string query, optional array of parameters, optional function callback
|
||||
// 2) object query with {
|
||||
// string query
|
||||
// optional array values,
|
||||
// optional function callback instead of as a separate parameter
|
||||
// optional string name to name & cache the query plan
|
||||
// optional string rowMode = 'array' for an array of results
|
||||
// }
|
||||
Client.prototype.query = function (config, values, callback) {
|
||||
var query
|
||||
var result
|
||||
var readTimeout
|
||||
var readTimeoutTimer
|
||||
var queryCallback
|
||||
|
||||
if (config === null || config === undefined) {
|
||||
throw new TypeError('Client was passed a null or undefined query')
|
||||
} else if (typeof config.submit === 'function') {
|
||||
readTimeout = config.query_timeout || this.connectionParameters.query_timeout
|
||||
result = query = config
|
||||
// accept query(new Query(...), (err, res) => { }) style
|
||||
if (typeof values === 'function') {
|
||||
config.callback = values
|
||||
}
|
||||
} else {
|
||||
readTimeout = config.query_timeout || this.connectionParameters.query_timeout
|
||||
query = new NativeQuery(config, values, callback)
|
||||
if (!query.callback) {
|
||||
let resolveOut, rejectOut
|
||||
result = new this._Promise((resolve, reject) => {
|
||||
resolveOut = resolve
|
||||
rejectOut = reject
|
||||
}).catch((err) => {
|
||||
Error.captureStackTrace(err)
|
||||
throw err
|
||||
})
|
||||
query.callback = (err, res) => (err ? rejectOut(err) : resolveOut(res))
|
||||
}
|
||||
}
|
||||
|
||||
if (readTimeout) {
|
||||
queryCallback = query.callback
|
||||
|
||||
readTimeoutTimer = setTimeout(() => {
|
||||
var error = new Error('Query read timeout')
|
||||
|
||||
process.nextTick(() => {
|
||||
query.handleError(error, this.connection)
|
||||
})
|
||||
|
||||
queryCallback(error)
|
||||
|
||||
// we already returned an error,
|
||||
// just do nothing if query completes
|
||||
query.callback = () => {}
|
||||
|
||||
// Remove from queue
|
||||
      var index = this._queryQueue.indexOf(query)
      if (index > -1) {
        this._queryQueue.splice(index, 1)
      }

      this._pulseQueryQueue()
    }, readTimeout)

    query.callback = (err, res) => {
      clearTimeout(readTimeoutTimer)
      queryCallback(err, res)
    }
  }

  if (!this._queryable) {
    query.native = this.native
    process.nextTick(() => {
      query.handleError(new Error('Client has encountered a connection error and is not queryable'))
    })
    return result
  }

  if (this._ending) {
    query.native = this.native
    process.nextTick(() => {
      query.handleError(new Error('Client was closed and is not queryable'))
    })
    return result
  }

  this._queryQueue.push(query)
  this._pulseQueryQueue()
  return result
}

// disconnect from the backend server
Client.prototype.end = function (cb) {
  var self = this

  this._ending = true

  if (!this._connected) {
    this.once('connect', this.end.bind(this, cb))
  }
  var result
  if (!cb) {
    result = new this._Promise(function (resolve, reject) {
      cb = (err) => (err ? reject(err) : resolve())
    })
  }
  this.native.end(function () {
    self._errorAllQueries(new Error('Connection terminated'))

    process.nextTick(() => {
      self.emit('end')
      if (cb) cb()
    })
  })
  return result
}

Client.prototype._hasActiveQuery = function () {
  return this._activeQuery && this._activeQuery.state !== 'error' && this._activeQuery.state !== 'end'
}

Client.prototype._pulseQueryQueue = function (initialConnection) {
  if (!this._connected) {
    return
  }
  if (this._hasActiveQuery()) {
    return
  }
  var query = this._queryQueue.shift()
  if (!query) {
    if (!initialConnection) {
      this.emit('drain')
    }
    return
  }
  this._activeQuery = query
  query.submit(this)
  var self = this
  query.once('_done', function () {
    self._pulseQueryQueue()
  })
}

// attempt to cancel an in-progress query
Client.prototype.cancel = function (query) {
  if (this._activeQuery === query) {
    this.native.cancel(function () {})
  } else if (this._queryQueue.indexOf(query) !== -1) {
    this._queryQueue.splice(this._queryQueue.indexOf(query), 1)
  }
}

Client.prototype.ref = function () {}
Client.prototype.unref = function () {}

Client.prototype.setTypeParser = function (oid, format, parseFn) {
  return this._types.setTypeParser(oid, format, parseFn)
}

Client.prototype.getTypeParser = function (oid, format) {
  return this._types.getTypeParser(oid, format)
}
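The `end()` method above follows node-postgres's callback-or-promise convention: when the caller supplies no callback, one is synthesized from a new Promise's `resolve`/`reject`. A minimal standalone sketch of that pattern (the `endLike` name and `doEnd` parameter are hypothetical, for illustration only):

```javascript
// Hypothetical standalone sketch of the callback-or-promise pattern used by
// Client.prototype.end: if no callback is given, synthesize one from a Promise.
function endLike(doEnd, cb) {
  let result
  if (!cb) {
    result = new Promise((resolve, reject) => {
      cb = (err) => (err ? reject(err) : resolve())
    })
  }
  doEnd((err) => cb(err))
  return result // undefined when the caller supplied a callback
}
```

This is why `client.end()` is awaitable while `client.end(cb)` returns `undefined`.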
2
node_modules/pg/lib/native/index.js
generated
vendored
Normal file
@ -0,0 +1,2 @@
'use strict'
module.exports = require('./client')
168
node_modules/pg/lib/native/query.js
generated
vendored
Normal file
@ -0,0 +1,168 @@
'use strict'

var EventEmitter = require('events').EventEmitter
var util = require('util')
var utils = require('../utils')

var NativeQuery = (module.exports = function (config, values, callback) {
  EventEmitter.call(this)
  config = utils.normalizeQueryConfig(config, values, callback)
  this.text = config.text
  this.values = config.values
  this.name = config.name
  this.queryMode = config.queryMode
  this.callback = config.callback
  this.state = 'new'
  this._arrayMode = config.rowMode === 'array'

  // if the 'row' event is listened for
  // then emit them as they come in
  // without setting singleRowMode to true
  // this has almost no meaning because libpq
  // reads all rows into memory before returning any
  this._emitRowEvents = false
  this.on(
    'newListener',
    function (event) {
      if (event === 'row') this._emitRowEvents = true
    }.bind(this)
  )
})

util.inherits(NativeQuery, EventEmitter)

var errorFieldMap = {
  /* eslint-disable quote-props */
  sqlState: 'code',
  statementPosition: 'position',
  messagePrimary: 'message',
  context: 'where',
  schemaName: 'schema',
  tableName: 'table',
  columnName: 'column',
  dataTypeName: 'dataType',
  constraintName: 'constraint',
  sourceFile: 'file',
  sourceLine: 'line',
  sourceFunction: 'routine',
}

NativeQuery.prototype.handleError = function (err) {
  // copy pq error fields into the error object
  var fields = this.native.pq.resultErrorFields()
  if (fields) {
    for (var key in fields) {
      var normalizedFieldName = errorFieldMap[key] || key
      err[normalizedFieldName] = fields[key]
    }
  }
  if (this.callback) {
    this.callback(err)
  } else {
    this.emit('error', err)
  }
  this.state = 'error'
}

NativeQuery.prototype.then = function (onSuccess, onFailure) {
  return this._getPromise().then(onSuccess, onFailure)
}

NativeQuery.prototype.catch = function (callback) {
  return this._getPromise().catch(callback)
}

NativeQuery.prototype._getPromise = function () {
  if (this._promise) return this._promise
  this._promise = new Promise(
    function (resolve, reject) {
      this._once('end', resolve)
      this._once('error', reject)
    }.bind(this)
  )
  return this._promise
}

NativeQuery.prototype.submit = function (client) {
  this.state = 'running'
  var self = this
  this.native = client.native
  client.native.arrayMode = this._arrayMode

  var after = function (err, rows, results) {
    client.native.arrayMode = false
    setImmediate(function () {
      self.emit('_done')
    })

    // handle possible query error
    if (err) {
      return self.handleError(err)
    }

    // emit row events for each row in the result
    if (self._emitRowEvents) {
      if (results.length > 1) {
        rows.forEach((rowOfRows, i) => {
          rowOfRows.forEach((row) => {
            self.emit('row', row, results[i])
          })
        })
      } else {
        rows.forEach(function (row) {
          self.emit('row', row, results)
        })
      }
    }

    // handle successful result
    self.state = 'end'
    self.emit('end', results)
    if (self.callback) {
      self.callback(null, results)
    }
  }

  if (process.domain) {
    after = process.domain.bind(after)
  }

  // named query
  if (this.name) {
    if (this.name.length > 63) {
      /* eslint-disable no-console */
      console.error('Warning! Postgres only supports 63 characters for query names.')
      console.error('You supplied %s (%s)', this.name, this.name.length)
      console.error('This can cause conflicts and silent errors executing queries')
      /* eslint-enable no-console */
    }
    var values = (this.values || []).map(utils.prepareValue)

    // check if the client has already executed this named query
    // if so...just execute it again - skip the planning phase
    if (client.namedQueries[this.name]) {
      if (this.text && client.namedQueries[this.name] !== this.text) {
        const err = new Error(`Prepared statements must be unique - '${this.name}' was used for a different statement`)
        return after(err)
      }
      return client.native.execute(this.name, values, after)
    }
    // plan the named query the first time, then execute it
    return client.native.prepare(this.name, this.text, values.length, function (err) {
      if (err) return after(err)
      client.namedQueries[self.name] = self.text
      return self.native.execute(self.name, values, after)
    })
  } else if (this.values) {
    if (!Array.isArray(this.values)) {
      const err = new Error('Query values must be an array')
      return after(err)
    }
    var vals = this.values.map(utils.prepareValue)
    client.native.query(this.text, vals, after)
  } else if (this.queryMode === 'extended') {
    client.native.query(this.text, [], after)
  } else {
    client.native.query(this.text, after)
  }
}
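The named-query branch of `submit` above caches statement text per client in `client.namedQueries`, so a repeat submit with the same name skips the planning phase, and reusing a name with different text is an error. A standalone sketch of just that cache check (the `resolveNamedQuery` helper is hypothetical, and it is simplified: the real code records the text only after `prepare` succeeds):

```javascript
// Hypothetical standalone sketch of the namedQueries cache check in
// NativeQuery.prototype.submit: 'execute' reuses the plan, 'prepare' creates
// it, and reusing a name with different text is rejected.
function resolveNamedQuery(namedQueries, name, text) {
  const cached = namedQueries[name]
  if (cached) {
    if (text && cached !== text) {
      return { action: 'error', message: `Prepared statements must be unique - '${name}' was used for a different statement` }
    }
    return { action: 'execute' }
  }
  namedQueries[name] = text
  return { action: 'prepare' }
}
```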
239
node_modules/pg/lib/query.js
generated
vendored
Normal file
@ -0,0 +1,239 @@
'use strict'

const { EventEmitter } = require('events')

const Result = require('./result')
const utils = require('./utils')

class Query extends EventEmitter {
  constructor(config, values, callback) {
    super()

    config = utils.normalizeQueryConfig(config, values, callback)

    this.text = config.text
    this.values = config.values
    this.rows = config.rows
    this.types = config.types
    this.name = config.name
    this.queryMode = config.queryMode
    this.binary = config.binary
    // use unique portal name each time
    this.portal = config.portal || ''
    this.callback = config.callback
    this._rowMode = config.rowMode
    if (process.domain && config.callback) {
      this.callback = process.domain.bind(config.callback)
    }
    this._result = new Result(this._rowMode, this.types)

    // potential for multiple results
    this._results = this._result
    this._canceledDueToError = false
  }

  requiresPreparation() {
    if (this.queryMode === 'extended') {
      return true
    }

    // named queries must always be prepared
    if (this.name) {
      return true
    }
    // always prepare if there are max number of rows expected per
    // portal execution
    if (this.rows) {
      return true
    }
    // don't prepare empty text queries
    if (!this.text) {
      return false
    }
    // prepare if there are values
    if (!this.values) {
      return false
    }
    return this.values.length > 0
  }

  _checkForMultirow() {
    // if we already have a result with a command property
    // then we've already executed one query in a multi-statement simple query
    // turn our results into an array of results
    if (this._result.command) {
      if (!Array.isArray(this._results)) {
        this._results = [this._result]
      }
      this._result = new Result(this._rowMode, this._result._types)
      this._results.push(this._result)
    }
  }

  // associates row metadata from the supplied
  // message with this query object
  // metadata used when parsing row results
  handleRowDescription(msg) {
    this._checkForMultirow()
    this._result.addFields(msg.fields)
    this._accumulateRows = this.callback || !this.listeners('row').length
  }

  handleDataRow(msg) {
    let row

    if (this._canceledDueToError) {
      return
    }

    try {
      row = this._result.parseRow(msg.fields)
    } catch (err) {
      this._canceledDueToError = err
      return
    }

    this.emit('row', row, this._result)
    if (this._accumulateRows) {
      this._result.addRow(row)
    }
  }

  handleCommandComplete(msg, connection) {
    this._checkForMultirow()
    this._result.addCommandComplete(msg)
    // need to sync after each command complete of a prepared statement
    // if we were using a row count which results in multiple calls to _getRows
    if (this.rows) {
      connection.sync()
    }
  }

  // if a named prepared statement is created with empty query text
  // the backend will send an emptyQuery message but *not* a command complete message
  // since we pipeline sync immediately after execute we don't need to do anything here
  // unless we have rows specified, in which case we did not pipeline the initial sync call
  handleEmptyQuery(connection) {
    if (this.rows) {
      connection.sync()
    }
  }

  handleError(err, connection) {
    // need to sync after error during a prepared statement
    if (this._canceledDueToError) {
      err = this._canceledDueToError
      this._canceledDueToError = false
    }
    // if callback supplied do not emit error event as uncaught error
    // events will bubble up to node process
    if (this.callback) {
      return this.callback(err)
    }
    this.emit('error', err)
  }

  handleReadyForQuery(con) {
    if (this._canceledDueToError) {
      return this.handleError(this._canceledDueToError, con)
    }
    if (this.callback) {
      try {
        this.callback(null, this._results)
      } catch (err) {
        process.nextTick(() => {
          throw err
        })
      }
    }
    this.emit('end', this._results)
  }

  submit(connection) {
    if (typeof this.text !== 'string' && typeof this.name !== 'string') {
      return new Error('A query must have either text or a name. Supplying neither is unsupported.')
    }
    const previous = connection.parsedStatements[this.name]
    if (this.text && previous && this.text !== previous) {
      return new Error(`Prepared statements must be unique - '${this.name}' was used for a different statement`)
    }
    if (this.values && !Array.isArray(this.values)) {
      return new Error('Query values must be an array')
    }
    if (this.requiresPreparation()) {
      this.prepare(connection)
    } else {
      connection.query(this.text)
    }
    return null
  }

  hasBeenParsed(connection) {
    return this.name && connection.parsedStatements[this.name]
  }

  handlePortalSuspended(connection) {
    this._getRows(connection, this.rows)
  }

  _getRows(connection, rows) {
    connection.execute({
      portal: this.portal,
      rows: rows,
    })
    // if we're not reading pages of rows send the sync command
    // to indicate the pipeline is finished
    if (!rows) {
      connection.sync()
    } else {
      // otherwise flush the call out to read more rows
      connection.flush()
    }
  }

  // http://developer.postgresql.org/pgdocs/postgres/protocol-flow.html#PROTOCOL-FLOW-EXT-QUERY
  prepare(connection) {
    // TODO refactor this poor encapsulation
    if (!this.hasBeenParsed(connection)) {
      connection.parse({
        text: this.text,
        name: this.name,
        types: this.types,
      })
    }

    // because we're mapping user supplied values to
    // postgres wire protocol compatible values it could
    // throw an exception, so try/catch this section
    try {
      connection.bind({
        portal: this.portal,
        statement: this.name,
        values: this.values,
        binary: this.binary,
        valueMapper: utils.prepareValue,
      })
    } catch (err) {
      this.handleError(err, connection)
      return
    }

    connection.describe({
      type: 'P',
      name: this.portal || '',
    })

    this._getRows(connection, this.rows)
  }

  handleCopyInResponse(connection) {
    connection.sendCopyFail('No source stream defined')
  }

  // eslint-disable-next-line no-unused-vars
  handleCopyData(msg, connection) {
    // noop
  }
}

module.exports = Query
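The `requiresPreparation` rules above decide whether a query goes through the extended (parse/bind/execute) protocol or the simple query protocol. A standalone restatement of that decision, useful for sanity-checking query configs (the function name is illustrative, not the library's API):

```javascript
// Standalone restatement of Query#requiresPreparation: extended queryMode,
// a name, a row limit, or parameter values all force the extended protocol.
function usesExtendedProtocol({ queryMode, name, rows, text, values } = {}) {
  if (queryMode === 'extended') return true
  if (name) return true
  if (rows) return true
  if (!text) return false
  if (!values) return false
  return values.length > 0
}
```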
107
node_modules/pg/lib/result.js
generated
vendored
Normal file
@ -0,0 +1,107 @@
'use strict'

var types = require('pg-types')

var matchRegexp = /^([A-Za-z]+)(?: (\d+))?(?: (\d+))?/

// result object returned from query
// in the 'end' event and also
// passed as second argument to provided callback
class Result {
  constructor(rowMode, types) {
    this.command = null
    this.rowCount = null
    this.oid = null
    this.rows = []
    this.fields = []
    this._parsers = undefined
    this._types = types
    this.RowCtor = null
    this.rowAsArray = rowMode === 'array'
    if (this.rowAsArray) {
      this.parseRow = this._parseRowAsArray
    }
    this._prebuiltEmptyResultObject = null
  }

  // adds a command complete message
  addCommandComplete(msg) {
    var match
    if (msg.text) {
      // pure javascript
      match = matchRegexp.exec(msg.text)
    } else {
      // native bindings
      match = matchRegexp.exec(msg.command)
    }
    if (match) {
      this.command = match[1]
      if (match[3]) {
        // COMMAND OID ROWS
        this.oid = parseInt(match[2], 10)
        this.rowCount = parseInt(match[3], 10)
      } else if (match[2]) {
        // COMMAND ROWS
        this.rowCount = parseInt(match[2], 10)
      }
    }
  }

  _parseRowAsArray(rowData) {
    var row = new Array(rowData.length)
    for (var i = 0, len = rowData.length; i < len; i++) {
      var rawValue = rowData[i]
      if (rawValue !== null) {
        row[i] = this._parsers[i](rawValue)
      } else {
        row[i] = null
      }
    }
    return row
  }

  parseRow(rowData) {
    var row = { ...this._prebuiltEmptyResultObject }
    for (var i = 0, len = rowData.length; i < len; i++) {
      var rawValue = rowData[i]
      var field = this.fields[i].name
      if (rawValue !== null) {
        row[field] = this._parsers[i](rawValue)
      } else {
        row[field] = null
      }
    }
    return row
  }

  addRow(row) {
    this.rows.push(row)
  }

  addFields(fieldDescriptions) {
    // clears field definitions
    // multiple query statements in 1 action can result in multiple sets
    // of rowDescriptions...eg: 'select NOW(); select 1::int;'
    // you need to reset the fields
    this.fields = fieldDescriptions
    if (this.fields.length) {
      this._parsers = new Array(fieldDescriptions.length)
    }

    var row = {}

    for (var i = 0; i < fieldDescriptions.length; i++) {
      var desc = fieldDescriptions[i]
      row[desc.name] = null

      if (this._types) {
        this._parsers[i] = this._types.getTypeParser(desc.dataTypeID, desc.format || 'text')
      } else {
        this._parsers[i] = types.getTypeParser(desc.dataTypeID, desc.format || 'text')
      }
    }
    this._prebuiltEmptyResultObject = { ...row }
  }
}

module.exports = Result
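The `matchRegexp` in `addCommandComplete` above parses PostgreSQL command tags such as `INSERT 0 3` or `UPDATE 5` into command, oid, and row count. A quick standalone demonstration of the same regex (the `parseCommandTag` wrapper is just for illustration):

```javascript
// The same command-tag regex as lib/result.js, applied to typical tags.
const matchRegexp = /^([A-Za-z]+)(?: (\d+))?(?: (\d+))?/

function parseCommandTag(tag) {
  const match = matchRegexp.exec(tag)
  if (!match) return null
  if (match[3]) {
    // COMMAND OID ROWS (only INSERT reports an oid)
    return { command: match[1], oid: parseInt(match[2], 10), rowCount: parseInt(match[3], 10) }
  }
  if (match[2]) {
    // COMMAND ROWS
    return { command: match[1], rowCount: parseInt(match[2], 10) }
  }
  return { command: match[1] }
}
```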
81
node_modules/pg/lib/stream.js
generated
vendored
Normal file
@ -0,0 +1,81 @@
const { getStream, getSecureStream } = getStreamFuncs()

module.exports = {
  /**
   * Get a socket stream compatible with the current runtime environment.
   * @returns {Duplex}
   */
  getStream,
  /**
   * Get a TLS secured socket, compatible with the current environment,
   * using the socket and other settings given in `options`.
   * @returns {Duplex}
   */
  getSecureStream,
}

/**
 * The stream functions that work in Node.js
 */
function getNodejsStreamFuncs() {
  function getStream(ssl) {
    const net = require('net')
    return new net.Socket()
  }

  function getSecureStream(options) {
    var tls = require('tls')
    return tls.connect(options)
  }
  return {
    getStream,
    getSecureStream,
  }
}

/**
 * The stream functions that work in Cloudflare Workers
 */
function getCloudflareStreamFuncs() {
  function getStream(ssl) {
    const { CloudflareSocket } = require('pg-cloudflare')
    return new CloudflareSocket(ssl)
  }

  function getSecureStream(options) {
    options.socket.startTls(options)
    return options.socket
  }
  return {
    getStream,
    getSecureStream,
  }
}

/**
 * Are we running in a Cloudflare Worker?
 *
 * @returns true if the code is currently running inside a Cloudflare Worker.
 */
function isCloudflareRuntime() {
  // Since 2022-03-21 the `global_navigator` compatibility flag is on for Cloudflare Workers
  // which means that `navigator.userAgent` will be defined.
  if (typeof navigator === 'object' && navigator !== null && typeof navigator.userAgent === 'string') {
    return navigator.userAgent === 'Cloudflare-Workers'
  }
  // In case `navigator` or `navigator.userAgent` is not defined then try a more sneaky approach
  if (typeof Response === 'function') {
    const resp = new Response(null, { cf: { thing: true } })
    if (typeof resp.cf === 'object' && resp.cf !== null && resp.cf.thing) {
      return true
    }
  }
  return false
}

function getStreamFuncs() {
  if (isCloudflareRuntime()) {
    return getCloudflareStreamFuncs()
  }
  return getNodejsStreamFuncs()
}
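The runtime check above can be exercised on its own. A minimal sketch of just the `navigator.userAgent` branch (the `Response`-probe fallback is omitted here), which returns `false` everywhere except an actual Cloudflare Worker:

```javascript
// Sketch of the primary Cloudflare Worker detection branch from lib/stream.js.
// In Node.js (where navigator.userAgent starts with "Node.js", or navigator is
// undefined on older versions) this returns false.
function isCloudflareWorkerSketch() {
  if (typeof navigator === 'object' && navigator !== null && typeof navigator.userAgent === 'string') {
    return navigator.userAgent === 'Cloudflare-Workers'
  }
  return false
}
```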
35
node_modules/pg/lib/type-overrides.js
generated
vendored
Normal file
@ -0,0 +1,35 @@
'use strict'

var types = require('pg-types')

function TypeOverrides(userTypes) {
  this._types = userTypes || types
  this.text = {}
  this.binary = {}
}

TypeOverrides.prototype.getOverrides = function (format) {
  switch (format) {
    case 'text':
      return this.text
    case 'binary':
      return this.binary
    default:
      return {}
  }
}

TypeOverrides.prototype.setTypeParser = function (oid, format, parseFn) {
  if (typeof format === 'function') {
    parseFn = format
    format = 'text'
  }
  this.getOverrides(format)[oid] = parseFn
}

TypeOverrides.prototype.getTypeParser = function (oid, format) {
  format = format || 'text'
  return this.getOverrides(format)[oid] || this._types.getTypeParser(oid, format)
}

module.exports = TypeOverrides
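`TypeOverrides` keeps per-format override maps and falls back to the underlying `pg-types` parser when no override is registered. A self-contained demonstration, with the class copied inline and a stub standing in for `pg-types` (the stub and oid 999 are made up for illustration):

```javascript
// Inline copy of TypeOverrides (from above) exercised against a stub type
// library instead of pg-types: overrides win, everything else falls through.
var stubTypes = {
  getTypeParser: function (oid, format) {
    return function (val) { return 'default:' + val }
  },
}

function TypeOverrides(userTypes) {
  this._types = userTypes || stubTypes
  this.text = {}
  this.binary = {}
}

TypeOverrides.prototype.getOverrides = function (format) {
  switch (format) {
    case 'text':
      return this.text
    case 'binary':
      return this.binary
    default:
      return {}
  }
}

TypeOverrides.prototype.setTypeParser = function (oid, format, parseFn) {
  if (typeof format === 'function') {
    parseFn = format
    format = 'text'
  }
  this.getOverrides(format)[oid] = parseFn
}

TypeOverrides.prototype.getTypeParser = function (oid, format) {
  format = format || 'text'
  return this.getOverrides(format)[oid] || this._types.getTypeParser(oid, format)
}

var overrides = new TypeOverrides(stubTypes)
overrides.setTypeParser(999, function (val) { return 'custom:' + val })
```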
213
node_modules/pg/lib/utils.js
generated
vendored
Normal file
@ -0,0 +1,213 @@
'use strict'

const defaults = require('./defaults')

function escapeElement(elementRepresentation) {
  var escaped = elementRepresentation.replace(/\\/g, '\\\\').replace(/"/g, '\\"')

  return '"' + escaped + '"'
}

// convert a JS array to a postgres array literal
// uses comma separator so won't work for types like box that use
// a different array separator.
function arrayString(val) {
  var result = '{'
  for (var i = 0; i < val.length; i++) {
    if (i > 0) {
      result = result + ','
    }
    if (val[i] === null || typeof val[i] === 'undefined') {
      result = result + 'NULL'
    } else if (Array.isArray(val[i])) {
      result = result + arrayString(val[i])
    } else if (ArrayBuffer.isView(val[i])) {
      var item = val[i]
      if (!(item instanceof Buffer)) {
        var buf = Buffer.from(item.buffer, item.byteOffset, item.byteLength)
        if (buf.length === item.byteLength) {
          item = buf
        } else {
          item = buf.slice(item.byteOffset, item.byteOffset + item.byteLength)
        }
      }
      result += '\\\\x' + item.toString('hex')
    } else {
      result += escapeElement(prepareValue(val[i]))
    }
  }
  result = result + '}'
  return result
}

// converts values from javascript types
// to their 'raw' counterparts for use as a postgres parameter
// note: you can override this function to provide your own conversion mechanism
// for complex types, etc...
var prepareValue = function (val, seen) {
  // null and undefined are both null for postgres
  if (val == null) {
    return null
  }
  if (val instanceof Buffer) {
    return val
  }
  if (ArrayBuffer.isView(val)) {
    var buf = Buffer.from(val.buffer, val.byteOffset, val.byteLength)
    if (buf.length === val.byteLength) {
      return buf
    }
    return buf.slice(val.byteOffset, val.byteOffset + val.byteLength) // Node.js v4 does not support those Buffer.from params
  }
  if (val instanceof Date) {
    if (defaults.parseInputDatesAsUTC) {
      return dateToStringUTC(val)
    } else {
      return dateToString(val)
    }
  }
  if (Array.isArray(val)) {
    return arrayString(val)
  }
  if (typeof val === 'object') {
    return prepareObject(val, seen)
  }
  return val.toString()
}

function prepareObject(val, seen) {
  if (val && typeof val.toPostgres === 'function') {
    seen = seen || []
    if (seen.indexOf(val) !== -1) {
      throw new Error('circular reference detected while preparing "' + val + '" for query')
    }
    seen.push(val)

    return prepareValue(val.toPostgres(prepareValue), seen)
  }
  return JSON.stringify(val)
}

function pad(number, digits) {
  number = '' + number
  while (number.length < digits) {
    number = '0' + number
  }
  return number
}

function dateToString(date) {
  var offset = -date.getTimezoneOffset()

  var year = date.getFullYear()
  var isBCYear = year < 1
  if (isBCYear) year = Math.abs(year) + 1 // negative years are 1 off their BC representation

  var ret =
    pad(year, 4) +
    '-' +
    pad(date.getMonth() + 1, 2) +
    '-' +
    pad(date.getDate(), 2) +
    'T' +
    pad(date.getHours(), 2) +
    ':' +
    pad(date.getMinutes(), 2) +
    ':' +
    pad(date.getSeconds(), 2) +
    '.' +
    pad(date.getMilliseconds(), 3)

  if (offset < 0) {
    ret += '-'
    offset *= -1
  } else {
    ret += '+'
  }

  ret += pad(Math.floor(offset / 60), 2) + ':' + pad(offset % 60, 2)
  if (isBCYear) ret += ' BC'
  return ret
}

function dateToStringUTC(date) {
  var year = date.getUTCFullYear()
  var isBCYear = year < 1
  if (isBCYear) year = Math.abs(year) + 1 // negative years are 1 off their BC representation

  var ret =
    pad(year, 4) +
    '-' +
    pad(date.getUTCMonth() + 1, 2) +
    '-' +
    pad(date.getUTCDate(), 2) +
    'T' +
    pad(date.getUTCHours(), 2) +
    ':' +
    pad(date.getUTCMinutes(), 2) +
    ':' +
    pad(date.getUTCSeconds(), 2) +
    '.' +
    pad(date.getUTCMilliseconds(), 3)

  ret += '+00:00'
  if (isBCYear) ret += ' BC'
  return ret
}

function normalizeQueryConfig(config, values, callback) {
  // can take in strings or config objects
  config = typeof config === 'string' ? { text: config } : config
  if (values) {
    if (typeof values === 'function') {
      config.callback = values
    } else {
      config.values = values
    }
  }
  if (callback) {
    config.callback = callback
  }
  return config
}

// Ported from PostgreSQL 9.2.4 source code in src/interfaces/libpq/fe-exec.c
const escapeIdentifier = function (str) {
  return '"' + str.replace(/"/g, '""') + '"'
}

const escapeLiteral = function (str) {
  var hasBackslash = false
  var escaped = "'"

  for (var i = 0; i < str.length; i++) {
    var c = str[i]
    if (c === "'") {
      escaped += c + c
    } else if (c === '\\') {
      escaped += c + c
      hasBackslash = true
    } else {
      escaped += c
    }
  }

  escaped += "'"

  if (hasBackslash === true) {
    escaped = ' E' + escaped
  }

  return escaped
}

module.exports = {
  prepareValue: function prepareValueWrapper(value) {
    // this ensures that extra arguments do not get passed into prepareValue
    // by accident, eg: from calling values.map(utils.prepareValue)
    return prepareValue(value)
  },
  normalizeQueryConfig,
  escapeIdentifier,
  escapeLiteral,
}
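The escaping rules above are easy to spot-check: `escapeIdentifier` doubles embedded double quotes, and `escapeLiteral` doubles single quotes and backslashes, prefixing the result with `E` when a backslash was seen. The two functions are copied inline here so the behavior can be verified without importing `pg`:

```javascript
// Inline copies of escapeIdentifier/escapeLiteral (from lib/utils.js above)
// so their escaping rules can be checked standalone.
const escapeIdentifier = function (str) {
  return '"' + str.replace(/"/g, '""') + '"'
}

const escapeLiteral = function (str) {
  var hasBackslash = false
  var escaped = "'"
  for (var i = 0; i < str.length; i++) {
    var c = str[i]
    if (c === "'") {
      escaped += c + c
    } else if (c === '\\') {
      escaped += c + c
      hasBackslash = true
    } else {
      escaped += c
    }
  }
  escaped += "'"
  if (hasBackslash === true) {
    escaped = ' E' + escaped
  }
  return escaped
}
```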
7
node_modules/pg/node_modules/pg-types/.travis.yml
generated
vendored
Normal file
@ -0,0 +1,7 @@
language: node_js
node_js:
  - '4'
  - 'lts/*'
  - 'node'
env:
  - PGUSER=postgres
14
node_modules/pg/node_modules/pg-types/Makefile
generated
vendored
Normal file
@ -0,0 +1,14 @@
.PHONY: publish-patch test

test:
	npm test

patch: test
	npm version patch -m "Bump version"
	git push origin master --tags
	npm publish

minor: test
	npm version minor -m "Bump version"
	git push origin master --tags
	npm publish
75
node_modules/pg/node_modules/pg-types/README.md
generated
vendored
Normal file
@ -0,0 +1,75 @@
# pg-types

This is the code that turns all the raw text from postgres into JavaScript types for [node-postgres](https://github.com/brianc/node-postgres.git)

## use

This module is consumed and exported from the root `pg` object of node-postgres. To access it, do the following:

```js
var types = require('pg').types
```

Generally what you'll want to do is override how a specific data-type is parsed and turned into a JavaScript type. By default the PostgreSQL backend server returns everything as strings. Every data type corresponds to a unique `OID` within the server, and these `OIDs` are sent back with the query response. So, you need to match a particular `OID` to a function you'd like to use to take the raw text input and produce a valid JavaScript object as a result. `null` values are never parsed.

Let's do something I commonly like to do on projects: return 64-bit integers `(int8)` as JavaScript integers. Because JavaScript doesn't have support for 64-bit integers node-postgres cannot confidently parse `int8` data type results as numbers because if you have a _huge_ number it will overflow and the result you'd get back from node-postgres would not be the result in the database. That would be a __very bad thing__ so node-postgres just returns `int8` results as strings and leaves the parsing up to you. Let's say that you know you won't ever have numbers greater than `int4` in your database, but you're tired of receiving results from the `COUNT(*)` function as strings (because that function returns `int8`). You would do this:

```js
var types = require('pg').types
types.setTypeParser(20, function(val) {
  return parseInt(val)
})
```

__boom__: now you get numbers instead of strings.

Just as another example -- not saying this is a good idea -- let's say you want to return all dates from your database as [moment](http://momentjs.com/docs/) objects. Okay, do this:

```js
var types = require('pg').types
var moment = require('moment')
var parseFn = function(val) {
  return val === null ? null : moment(val)
}
types.setTypeParser(types.builtins.TIMESTAMPTZ, parseFn)
types.setTypeParser(types.builtins.TIMESTAMP, parseFn)
```

_note: I've never done that with my dates, and I'm not 100% sure moment can parse all the date strings returned from postgres. It's just an example!_

If you're thinking "gee, this seems pretty handy, but how can I get a list of all the OIDs in the database and what they correspond to?!?!?!" worry not:

```bash
$ psql -c "select typname, oid, typarray from pg_type order by oid"
```

If you want to find out the OID of a specific type:

```bash
$ psql -c "select typname, oid, typarray from pg_type where typname = 'daterange' order by oid"
```

:smile:

## license

The MIT License (MIT)

Copyright (c) 2014 Brian M. Carlson

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
137
node_modules/pg/node_modules/pg-types/index.d.ts
generated
vendored
Normal file
@ -0,0 +1,137 @@
|
||||
export enum TypeId {
|
||||
BOOL = 16,
|
||||
BYTEA = 17,
|
||||
CHAR = 18,
|
||||
INT8 = 20,
|
||||
INT2 = 21,
|
||||
INT4 = 23,
|
||||
REGPROC = 24,
|
||||
TEXT = 25,
|
||||
OID = 26,
|
||||
TID = 27,
|
||||
XID = 28,
|
||||
CID = 29,
|
||||
JSON = 114,
|
||||
XML = 142,
|
||||
PG_NODE_TREE = 194,
|
||||
SMGR = 210,
|
||||
PATH = 602,
|
||||
POLYGON = 604,
|
||||
CIDR = 650,
|
||||
FLOAT4 = 700,
|
||||
FLOAT8 = 701,
|
||||
ABSTIME = 702,
|
||||
RELTIME = 703,
|
||||
TINTERVAL = 704,
|
||||
CIRCLE = 718,
|
||||
MACADDR8 = 774,
|
||||
MONEY = 790,
|
||||
MACADDR = 829,
|
||||
INET = 869,
|
||||
ACLITEM = 1033,
|
||||
BPCHAR = 1042,
|
||||
VARCHAR = 1043,
|
||||
DATE = 1082,
|
||||
TIME = 1083,
|
||||
TIMESTAMP = 1114,
|
||||
TIMESTAMPTZ = 1184,
|
||||
INTERVAL = 1186,
|
||||
TIMETZ = 1266,
|
||||
BIT = 1560,
|
||||
VARBIT = 1562,
|
||||
NUMERIC = 1700,
|
||||
REFCURSOR = 1790,
|
||||
REGPROCEDURE = 2202,
|
||||
REGOPER = 2203,
|
||||
REGOPERATOR = 2204,
|
||||
REGCLASS = 2205,
|
||||
REGTYPE = 2206,
|
||||
UUID = 2950,
|
||||
TXID_SNAPSHOT = 2970,
|
||||
PG_LSN = 3220,
|
||||
PG_NDISTINCT = 3361,
|
||||
PG_DEPENDENCIES = 3402,
|
||||
TSVECTOR = 3614,
|
||||
TSQUERY = 3615,
|
||||
GTSVECTOR = 3642,
|
||||
REGCONFIG = 3734,
|
||||
REGDICTIONARY = 3769,
|
||||
JSONB = 3802,
|
||||
REGNAMESPACE = 4089,
|
||||
REGROLE = 4096
|
||||
}
|
||||
|
||||
export type builtinsTypes =
|
||||
'BOOL' |
|
||||
'BYTEA' |
|
||||
'CHAR' |
|
||||
'INT8' |
|
||||
'INT2' |
|
||||
'INT4' |
|
||||
'REGPROC' |
|
||||
'TEXT' |
|
||||
'OID' |
|
||||
'TID' |
|
||||
'XID' |
|
||||
'CID' |
|
||||
'JSON' |
|
||||
'XML' |
|
||||
'PG_NODE_TREE' |
|
||||
'SMGR' |
|
||||
'PATH' |
|
||||
'POLYGON' |
|
||||
'CIDR' |
|
||||
'FLOAT4' |
|
||||
'FLOAT8' |
|
||||
'ABSTIME' |
|
||||
'RELTIME' |
|
||||
'TINTERVAL' |
|
||||
'CIRCLE' |
|
||||
'MACADDR8' |
|
||||
'MONEY' |
|
||||
'MACADDR' |
|
||||
'INET' |
|
||||
'ACLITEM' |
|
||||
'BPCHAR' |
|
||||
'VARCHAR' |
|
||||
'DATE' |
|
||||
'TIME' |
|
||||
'TIMESTAMP' |
|
||||
'TIMESTAMPTZ' |
|
||||
'INTERVAL' |
|
||||
'TIMETZ' |
|
||||
'BIT' |
|
||||
'VARBIT' |
|
||||
'NUMERIC' |
|
||||
'REFCURSOR' |
|
||||
'REGPROCEDURE' |
|
||||
'REGOPER' |
|
||||
'REGOPERATOR' |
|
||||
'REGCLASS' |
|
||||
'REGTYPE' |
|
||||
'UUID' |
|
||||
'TXID_SNAPSHOT' |
|
||||
'PG_LSN' |
|
||||
'PG_NDISTINCT' |
|
||||
'PG_DEPENDENCIES' |
|
||||
'TSVECTOR' |
|
||||
'TSQUERY' |
|
||||
'GTSVECTOR' |
|
||||
'REGCONFIG' |
|
||||
'REGDICTIONARY' |
|
||||
'JSONB' |
|
||||
'REGNAMESPACE' |
|
||||
'REGROLE';
|
||||
|
||||
export type TypesBuiltins = {[key in builtinsTypes]: TypeId};
|
||||
|
||||
export type TypeFormat = 'text' | 'binary';
|
||||
|
||||
export const builtins: TypesBuiltins;
|
||||
|
||||
export function setTypeParser (id: TypeId, parseFn: ((value: string) => any)): void;
|
||||
export function setTypeParser (id: TypeId, format: TypeFormat, parseFn: (value: string) => any): void;
|
||||
|
||||
export const getTypeParser: (id: TypeId, format?: TypeFormat) => any
|
||||
|
||||
export const arrayParser: (source: string, transform: (entry: any) => any) => any[];
|
||||
47
node_modules/pg/node_modules/pg-types/index.js
generated
vendored
Normal file
@ -0,0 +1,47 @@
var textParsers = require('./lib/textParsers');
var binaryParsers = require('./lib/binaryParsers');
var arrayParser = require('./lib/arrayParser');
var builtinTypes = require('./lib/builtins');

exports.getTypeParser = getTypeParser;
exports.setTypeParser = setTypeParser;
exports.arrayParser = arrayParser;
exports.builtins = builtinTypes;

var typeParsers = {
  text: {},
  binary: {}
};

//the empty parse function
function noParse (val) {
  return String(val);
};

//returns a function used to convert a specific type (specified by
//oid) into a result javascript type
//note: the oid can be obtained via the following sql query:
//SELECT oid FROM pg_type WHERE typname = 'TYPE_NAME_HERE';
function getTypeParser (oid, format) {
  format = format || 'text';
  if (!typeParsers[format]) {
    return noParse;
  }
  return typeParsers[format][oid] || noParse;
};

function setTypeParser (oid, format, parseFn) {
  if(typeof format == 'function') {
    parseFn = format;
    format = 'text';
  }
  typeParsers[format][oid] = parseFn;
};

textParsers.init(function(oid, converter) {
  typeParsers.text[oid] = converter;
});

binaryParsers.init(function(oid, converter) {
  typeParsers.binary[oid] = converter;
});
21
node_modules/pg/node_modules/pg-types/index.test-d.ts
generated
vendored
Normal file
@ -0,0 +1,21 @@
import * as types from '.';
import { expectType } from 'tsd';

// builtins
expectType<types.TypesBuiltins>(types.builtins);

// getTypeParser
const noParse = types.getTypeParser(types.builtins.NUMERIC, 'text');
const numericParser = types.getTypeParser(types.builtins.NUMERIC, 'binary');
expectType<string>(noParse('noParse'));
expectType<number>(numericParser([200, 1, 0, 15]));

// getArrayParser
const value = types.arrayParser('{1,2,3}', (num) => parseInt(num));
expectType<number[]>(value);

//setTypeParser
types.setTypeParser(types.builtins.INT8, parseInt);
types.setTypeParser(types.builtins.FLOAT8, parseFloat);
types.setTypeParser(types.builtins.FLOAT8, 'binary', (data) => data[0]);
types.setTypeParser(types.builtins.FLOAT8, 'text', parseFloat);
11
node_modules/pg/node_modules/pg-types/lib/arrayParser.js
generated
vendored
Normal file
@ -0,0 +1,11 @@
var array = require('postgres-array');

module.exports = {
  create: function (source, transform) {
    return {
      parse: function() {
        return array.parse(source, transform);
      }
    };
  }
};
257
node_modules/pg/node_modules/pg-types/lib/binaryParsers.js
generated
vendored
Normal file
@ -0,0 +1,257 @@
|
||||
var parseInt64 = require('pg-int8');
|
||||
|
||||
var parseBits = function(data, bits, offset, invert, callback) {
|
||||
offset = offset || 0;
|
||||
invert = invert || false;
|
||||
callback = callback || function(lastValue, newValue, bits) { return (lastValue * Math.pow(2, bits)) + newValue; };
|
||||
var offsetBytes = offset >> 3;
|
||||
|
||||
var inv = function(value) {
|
||||
if (invert) {
|
||||
return ~value & 0xff;
|
||||
}
|
||||
|
||||
return value;
|
||||
};
|
||||
|
||||
// read first (maybe partial) byte
|
||||
var mask = 0xff;
|
||||
var firstBits = 8 - (offset % 8);
|
||||
if (bits < firstBits) {
|
||||
mask = (0xff << (8 - bits)) & 0xff;
|
||||
firstBits = bits;
|
||||
}
|
||||
|
||||
if (offset) {
|
||||
mask = mask >> (offset % 8);
|
||||
}
|
||||
|
||||
var result = 0;
|
||||
if ((offset % 8) + bits >= 8) {
|
||||
result = callback(0, inv(data[offsetBytes]) & mask, firstBits);
|
||||
}
|
||||
|
||||
// read bytes
|
||||
var bytes = (bits + offset) >> 3;
|
||||
for (var i = offsetBytes + 1; i < bytes; i++) {
|
||||
result = callback(result, inv(data[i]), 8);
|
||||
}
|
||||
|
||||
// bits to read, that are not a complete byte
|
||||
var lastBits = (bits + offset) % 8;
|
||||
if (lastBits > 0) {
|
||||
result = callback(result, inv(data[bytes]) >> (8 - lastBits), lastBits);
|
||||
}
|
||||
|
||||
return result;
|
||||
};
|
||||
|
||||
var parseFloatFromBits = function(data, precisionBits, exponentBits) {
|
||||
var bias = Math.pow(2, exponentBits - 1) - 1;
|
||||
var sign = parseBits(data, 1);
|
||||
var exponent = parseBits(data, exponentBits, 1);
|
||||
|
||||
if (exponent === 0) {
|
||||
return 0;
|
||||
}
|
||||
|
||||
// parse mantissa
|
||||
var precisionBitsCounter = 1;
|
||||
var parsePrecisionBits = function(lastValue, newValue, bits) {
|
||||
if (lastValue === 0) {
|
||||
lastValue = 1;
|
||||
}
|
||||
|
||||
for (var i = 1; i <= bits; i++) {
|
||||
precisionBitsCounter /= 2;
|
||||
if ((newValue & (0x1 << (bits - i))) > 0) {
|
||||
lastValue += precisionBitsCounter;
|
||||
}
|
||||
}
|
||||
|
||||
return lastValue;
|
||||
};
|
||||
|
||||
var mantissa = parseBits(data, precisionBits, exponentBits + 1, false, parsePrecisionBits);
|
||||
|
||||
// special cases
|
||||
if (exponent == (Math.pow(2, exponentBits + 1) - 1)) {
|
||||
if (mantissa === 0) {
|
||||
return (sign === 0) ? Infinity : -Infinity;
|
||||
}
|
||||
|
||||
return NaN;
|
||||
}
|
||||
|
||||
// normal number
|
||||
return ((sign === 0) ? 1 : -1) * Math.pow(2, exponent - bias) * mantissa;
|
||||
};
|
||||
|
||||
var parseInt16 = function(value) {
|
||||
if (parseBits(value, 1) == 1) {
|
||||
return -1 * (parseBits(value, 15, 1, true) + 1);
|
||||
}
|
||||
|
||||
return parseBits(value, 15, 1);
|
||||
};
|
||||
|
||||
var parseInt32 = function(value) {
|
||||
if (parseBits(value, 1) == 1) {
|
||||
return -1 * (parseBits(value, 31, 1, true) + 1);
|
||||
}
|
||||
|
||||
return parseBits(value, 31, 1);
|
||||
};
|
||||
|
||||
var parseFloat32 = function(value) {
|
||||
return parseFloatFromBits(value, 23, 8);
|
||||
};
|
||||
|
||||
var parseFloat64 = function(value) {
|
||||
return parseFloatFromBits(value, 52, 11);
|
||||
};
|
||||
|
||||
var parseNumeric = function(value) {
|
||||
var sign = parseBits(value, 16, 32);
|
||||
if (sign == 0xc000) {
|
||||
return NaN;
|
||||
}
|
||||
|
||||
var weight = Math.pow(10000, parseBits(value, 16, 16));
|
||||
var result = 0;
|
||||
|
||||
var digits = [];
|
||||
var ndigits = parseBits(value, 16);
|
||||
for (var i = 0; i < ndigits; i++) {
|
||||
result += parseBits(value, 16, 64 + (16 * i)) * weight;
|
||||
weight /= 10000;
|
||||
}
|
||||
|
||||
var scale = Math.pow(10, parseBits(value, 16, 48));
|
||||
return ((sign === 0) ? 1 : -1) * Math.round(result * scale) / scale;
|
||||
};
|
||||
|
||||
var parseDate = function(isUTC, value) {
|
||||
var sign = parseBits(value, 1);
|
||||
var rawValue = parseBits(value, 63, 1);
|
||||
|
||||
// discard usecs and shift from 2000 to 1970
|
||||
var result = new Date((((sign === 0) ? 1 : -1) * rawValue / 1000) + 946684800000);
|
||||
|
||||
if (!isUTC) {
|
||||
result.setTime(result.getTime() + result.getTimezoneOffset() * 60000);
|
||||
}
|
||||
|
||||
// add microseconds to the date
|
||||
result.usec = rawValue % 1000;
|
||||
result.getMicroSeconds = function() {
|
||||
return this.usec;
|
||||
};
|
||||
result.setMicroSeconds = function(value) {
|
||||
this.usec = value;
|
||||
};
|
||||
result.getUTCMicroSeconds = function() {
|
||||
return this.usec;
|
||||
};
|
||||
|
||||
return result;
|
||||
};
|
||||
|
||||
var parseArray = function(value) {
|
||||
var dim = parseBits(value, 32);
|
||||
|
||||
var flags = parseBits(value, 32, 32);
|
||||
var elementType = parseBits(value, 32, 64);
|
||||
|
||||
var offset = 96;
|
||||
var dims = [];
|
||||
for (var i = 0; i < dim; i++) {
|
||||
// parse dimension
|
||||
dims[i] = parseBits(value, 32, offset);
|
||||
offset += 32;
|
||||
|
||||
// ignore lower bounds
|
||||
offset += 32;
|
||||
}
|
||||
|
||||
var parseElement = function(elementType) {
|
||||
// parse content length
|
||||
var length = parseBits(value, 32, offset);
|
||||
offset += 32;
|
||||
|
||||
// parse null values
|
||||
if (length == 0xffffffff) {
|
||||
return null;
|
||||
}
|
||||
|
||||
var result;
|
||||
if ((elementType == 0x17) || (elementType == 0x14)) {
|
||||
// int/bigint
|
||||
result = parseBits(value, length * 8, offset);
|
||||
offset += length * 8;
|
||||
return result;
|
||||
}
|
||||
else if (elementType == 0x19) {
|
||||
// string
|
||||
result = value.toString(this.encoding, offset >> 3, (offset += (length << 3)) >> 3);
|
||||
return result;
|
||||
}
|
||||
else {
|
||||
console.log("ERROR: ElementType not implemented: " + elementType);
|
||||
}
|
||||
};
|
||||
|
||||
var parse = function(dimension, elementType) {
|
||||
var array = [];
|
||||
var i;
|
||||
|
||||
if (dimension.length > 1) {
|
||||
var count = dimension.shift();
|
||||
for (i = 0; i < count; i++) {
|
||||
array[i] = parse(dimension, elementType);
|
||||
}
|
||||
dimension.unshift(count);
|
||||
}
|
||||
else {
|
||||
for (i = 0; i < dimension[0]; i++) {
|
||||
array[i] = parseElement(elementType);
|
||||
}
|
||||
}
|
||||
|
||||
return array;
|
||||
};
|
||||
|
||||
return parse(dims, elementType);
|
||||
};
|
||||
|
||||
var parseText = function(value) {
|
||||
return value.toString('utf8');
|
||||
};
|
||||
|
||||
var parseBool = function(value) {
|
||||
if(value === null) return null;
|
||||
return (parseBits(value, 8) > 0);
|
||||
};
|
||||
|
||||
var init = function(register) {
|
||||
register(20, parseInt64);
|
||||
register(21, parseInt16);
|
||||
register(23, parseInt32);
|
||||
register(26, parseInt32);
|
||||
register(1700, parseNumeric);
|
||||
register(700, parseFloat32);
|
||||
register(701, parseFloat64);
|
||||
register(16, parseBool);
|
||||
register(1114, parseDate.bind(null, false));
|
||||
register(1184, parseDate.bind(null, true));
|
||||
register(1000, parseArray);
|
||||
register(1007, parseArray);
|
||||
register(1016, parseArray);
|
||||
register(1008, parseArray);
|
||||
register(1009, parseArray);
|
||||
register(25, parseText);
|
||||
};
|
||||
|
||||
module.exports = {
|
||||
init: init
|
||||
};
|
||||
73
node_modules/pg/node_modules/pg-types/lib/builtins.js
generated
vendored
Normal file
@ -0,0 +1,73 @@
|
||||
/**
|
||||
* Following query was used to generate this file:
|
||||
|
||||
SELECT json_object_agg(UPPER(PT.typname), PT.oid::int4 ORDER BY pt.oid)
|
||||
FROM pg_type PT
|
||||
WHERE typnamespace = (SELECT pgn.oid FROM pg_namespace pgn WHERE nspname = 'pg_catalog') -- Take only built-in Postgres types with stable OID (extension types are not guaranteed to be stable)
|
||||
AND typtype = 'b' -- Only basic types
|
||||
AND typelem = 0 -- Ignore aliases
|
||||
AND typisdefined -- Ignore undefined types
|
||||
*/
|
||||
|
||||
module.exports = {
|
||||
BOOL: 16,
|
||||
BYTEA: 17,
|
||||
CHAR: 18,
|
||||
INT8: 20,
|
||||
INT2: 21,
|
||||
INT4: 23,
|
||||
REGPROC: 24,
|
||||
TEXT: 25,
|
||||
OID: 26,
|
||||
TID: 27,
|
||||
XID: 28,
|
||||
CID: 29,
|
||||
JSON: 114,
|
||||
XML: 142,
|
||||
PG_NODE_TREE: 194,
|
||||
SMGR: 210,
|
||||
PATH: 602,
|
||||
POLYGON: 604,
|
||||
CIDR: 650,
|
||||
FLOAT4: 700,
|
||||
FLOAT8: 701,
|
||||
ABSTIME: 702,
|
||||
RELTIME: 703,
|
||||
TINTERVAL: 704,
|
||||
CIRCLE: 718,
|
||||
MACADDR8: 774,
|
||||
MONEY: 790,
|
||||
MACADDR: 829,
|
||||
INET: 869,
|
||||
ACLITEM: 1033,
|
||||
BPCHAR: 1042,
|
||||
VARCHAR: 1043,
|
||||
DATE: 1082,
|
||||
TIME: 1083,
|
||||
TIMESTAMP: 1114,
|
||||
TIMESTAMPTZ: 1184,
|
||||
INTERVAL: 1186,
|
||||
TIMETZ: 1266,
|
||||
BIT: 1560,
|
||||
VARBIT: 1562,
|
||||
NUMERIC: 1700,
|
||||
REFCURSOR: 1790,
|
||||
REGPROCEDURE: 2202,
|
||||
REGOPER: 2203,
|
||||
REGOPERATOR: 2204,
|
||||
REGCLASS: 2205,
|
||||
REGTYPE: 2206,
|
||||
UUID: 2950,
|
||||
TXID_SNAPSHOT: 2970,
|
||||
PG_LSN: 3220,
|
||||
PG_NDISTINCT: 3361,
|
||||
PG_DEPENDENCIES: 3402,
|
||||
TSVECTOR: 3614,
|
||||
TSQUERY: 3615,
|
||||
GTSVECTOR: 3642,
|
||||
REGCONFIG: 3734,
|
||||
REGDICTIONARY: 3769,
|
||||
JSONB: 3802,
|
||||
REGNAMESPACE: 4089,
|
||||
REGROLE: 4096
|
||||
};
|
||||
215
node_modules/pg/node_modules/pg-types/lib/textParsers.js
generated
vendored
Normal file
@ -0,0 +1,215 @@
|
||||
var array = require('postgres-array')
|
||||
var arrayParser = require('./arrayParser');
|
||||
var parseDate = require('postgres-date');
|
||||
var parseInterval = require('postgres-interval');
|
||||
var parseByteA = require('postgres-bytea');
|
||||
|
||||
function allowNull (fn) {
|
||||
return function nullAllowed (value) {
|
||||
if (value === null) return value
|
||||
return fn(value)
|
||||
}
|
||||
}
|
||||
|
||||
function parseBool (value) {
|
||||
if (value === null) return value
|
||||
return value === 'TRUE' ||
|
||||
value === 't' ||
|
||||
value === 'true' ||
|
||||
value === 'y' ||
|
||||
value === 'yes' ||
|
||||
value === 'on' ||
|
||||
value === '1';
|
||||
}
|
||||
|
||||
function parseBoolArray (value) {
|
||||
if (!value) return null
|
||||
return array.parse(value, parseBool)
|
||||
}
|
||||
|
||||
function parseBaseTenInt (string) {
|
||||
return parseInt(string, 10)
|
||||
}
|
||||
|
||||
function parseIntegerArray (value) {
|
||||
if (!value) return null
|
||||
return array.parse(value, allowNull(parseBaseTenInt))
|
||||
}
|
||||
|
||||
function parseBigIntegerArray (value) {
|
||||
if (!value) return null
|
||||
return array.parse(value, allowNull(function (entry) {
|
||||
return parseBigInteger(entry).trim()
|
||||
}))
|
||||
}
|
||||
|
||||
var parsePointArray = function(value) {
|
||||
if(!value) { return null; }
|
||||
var p = arrayParser.create(value, function(entry) {
|
||||
if(entry !== null) {
|
||||
entry = parsePoint(entry);
|
||||
}
|
||||
return entry;
|
||||
});
|
||||
|
||||
return p.parse();
|
||||
};
|
||||
|
||||
var parseFloatArray = function(value) {
|
||||
if(!value) { return null; }
|
||||
var p = arrayParser.create(value, function(entry) {
|
||||
if(entry !== null) {
|
||||
entry = parseFloat(entry);
|
||||
}
|
||||
return entry;
|
||||
});
|
||||
|
||||
return p.parse();
|
||||
};
|
||||
|
||||
var parseStringArray = function(value) {
|
||||
if(!value) { return null; }
|
||||
|
||||
var p = arrayParser.create(value);
|
||||
return p.parse();
|
||||
};
|
||||
|
||||
var parseDateArray = function(value) {
|
||||
if (!value) { return null; }
|
||||
|
||||
var p = arrayParser.create(value, function(entry) {
|
||||
if (entry !== null) {
|
||||
entry = parseDate(entry);
|
||||
}
|
||||
return entry;
|
||||
});
|
||||
|
||||
return p.parse();
|
||||
};
|
||||
|
||||
var parseIntervalArray = function(value) {
|
||||
if (!value) { return null; }
|
||||
|
||||
var p = arrayParser.create(value, function(entry) {
|
||||
if (entry !== null) {
|
||||
entry = parseInterval(entry);
|
||||
}
|
||||
return entry;
|
||||
});
|
||||
|
||||
return p.parse();
|
||||
};
|
||||
|
||||
var parseByteAArray = function(value) {
|
||||
if (!value) { return null; }
|
||||
|
||||
return array.parse(value, allowNull(parseByteA));
|
||||
};
|
||||
|
||||
var parseInteger = function(value) {
|
||||
return parseInt(value, 10);
|
||||
};
|
||||
|
||||
var parseBigInteger = function(value) {
|
||||
var valStr = String(value);
|
||||
if (/^\d+$/.test(valStr)) { return valStr; }
|
||||
return value;
|
||||
};
|
||||
|
||||
var parseJsonArray = function(value) {
|
||||
if (!value) { return null; }
|
||||
|
||||
return array.parse(value, allowNull(JSON.parse));
|
||||
};
|
||||
|
||||
var parsePoint = function(value) {
|
||||
if (value[0] !== '(') { return null; }
|
||||
|
||||
value = value.substring( 1, value.length - 1 ).split(',');
|
||||
|
||||
return {
|
||||
x: parseFloat(value[0])
|
||||
, y: parseFloat(value[1])
|
||||
};
|
||||
};
|
||||
|
||||
var parseCircle = function(value) {
|
||||
if (value[0] !== '<' && value[1] !== '(') { return null; }
|
||||
|
||||
var point = '(';
|
||||
var radius = '';
|
||||
var pointParsed = false;
|
||||
for (var i = 2; i < value.length - 1; i++){
|
||||
if (!pointParsed) {
|
||||
point += value[i];
|
||||
}
|
||||
|
||||
if (value[i] === ')') {
|
||||
pointParsed = true;
|
||||
continue;
|
||||
} else if (!pointParsed) {
|
||||
continue;
|
||||
}
|
||||
|
||||
if (value[i] === ','){
|
||||
continue;
|
||||
}
|
||||
|
||||
radius += value[i];
|
||||
}
|
||||
var result = parsePoint(point);
|
||||
result.radius = parseFloat(radius);
|
||||
|
||||
return result;
|
||||
};
|
||||
|
||||
var init = function(register) {
|
||||
register(20, parseBigInteger); // int8
|
||||
register(21, parseInteger); // int2
|
||||
register(23, parseInteger); // int4
|
||||
register(26, parseInteger); // oid
|
||||
register(700, parseFloat); // float4/real
|
||||
register(701, parseFloat); // float8/double
|
||||
register(16, parseBool);
|
||||
register(1082, parseDate); // date
|
||||
register(1114, parseDate); // timestamp without timezone
|
||||
register(1184, parseDate); // timestamp
|
||||
register(600, parsePoint); // point
|
||||
register(651, parseStringArray); // cidr[]
|
||||
register(718, parseCircle); // circle
|
||||
register(1000, parseBoolArray);
|
||||
register(1001, parseByteAArray);
|
||||
register(1005, parseIntegerArray); // _int2
|
||||
register(1007, parseIntegerArray); // _int4
|
||||
register(1028, parseIntegerArray); // oid[]
|
||||
register(1016, parseBigIntegerArray); // _int8
|
||||
register(1017, parsePointArray); // point[]
|
||||
register(1021, parseFloatArray); // _float4
|
||||
register(1022, parseFloatArray); // _float8
|
||||
register(1231, parseFloatArray); // _numeric
|
||||
register(1014, parseStringArray); // char[]
register(1015, parseStringArray); // varchar[]
|
||||
register(1008, parseStringArray);
|
||||
register(1009, parseStringArray);
|
||||
register(1040, parseStringArray); // macaddr[]
|
||||
register(1041, parseStringArray); // inet[]
|
||||
register(1115, parseDateArray); // timestamp without time zone[]
|
||||
register(1182, parseDateArray); // _date
|
||||
register(1185, parseDateArray); // timestamp with time zone[]
|
||||
register(1186, parseInterval);
|
||||
register(1187, parseIntervalArray);
|
||||
register(17, parseByteA);
|
||||
register(114, JSON.parse.bind(JSON)); // json
|
||||
register(3802, JSON.parse.bind(JSON)); // jsonb
|
||||
register(199, parseJsonArray); // json[]
|
||||
register(3807, parseJsonArray); // jsonb[]
|
||||
register(3907, parseStringArray); // numrange[]
|
||||
register(2951, parseStringArray); // uuid[]
|
||||
register(791, parseStringArray); // money[]
|
||||
register(1183, parseStringArray); // time[]
|
||||
register(1270, parseStringArray); // timetz[]
|
||||
};
|
||||
|
||||
module.exports = {
|
||||
init: init
|
||||
};
|
||||
42
node_modules/pg/node_modules/pg-types/package.json
generated
vendored
Normal file
@ -0,0 +1,42 @@
{
  "name": "pg-types",
  "version": "2.2.0",
  "description": "Query result type converters for node-postgres",
  "main": "index.js",
  "scripts": {
    "test": "tape test/*.js | tap-spec && npm run test-ts",
    "test-ts": "if-node-version '>= 8' tsd"
  },
  "repository": {
    "type": "git",
    "url": "git://github.com/brianc/node-pg-types.git"
  },
  "keywords": [
    "postgres",
    "PostgreSQL",
    "pg"
  ],
  "author": "Brian M. Carlson",
  "license": "MIT",
  "bugs": {
    "url": "https://github.com/brianc/node-pg-types/issues"
  },
  "homepage": "https://github.com/brianc/node-pg-types",
  "devDependencies": {
    "if-node-version": "^1.1.1",
    "pff": "^1.0.0",
    "tap-spec": "^4.0.0",
    "tape": "^4.0.0",
    "tsd": "^0.7.4"
  },
  "dependencies": {
    "pg-int8": "1.0.1",
    "postgres-array": "~2.0.0",
    "postgres-bytea": "~1.0.0",
    "postgres-date": "~1.0.4",
    "postgres-interval": "^1.1.0"
  },
  "engines": {
    "node": ">=4"
  }
}
24
node_modules/pg/node_modules/pg-types/test/index.js
generated
vendored
Normal file
@ -0,0 +1,24 @@
var test = require('tape')
var printf = require('pff')
var getTypeParser = require('../').getTypeParser
var types = require('./types')

test('types', function (t) {
  Object.keys(types).forEach(function (typeName) {
    var type = types[typeName]
    t.test(typeName, function (t) {
      var parser = getTypeParser(type.id, type.format)
      type.tests.forEach(function (tests) {
        var input = tests[0]
        var expected = tests[1]
        var result = parser(input)
        if (typeof expected === 'function') {
          return expected(t, result)
        }
        t.equal(result, expected)
      })
      t.end()
    })
  })
})
597
node_modules/pg/node_modules/pg-types/test/types.js
generated
vendored
Normal file
@ -0,0 +1,597 @@
|
||||
'use strict'
|
||||
|
||||
exports['string/varchar'] = {
|
||||
format: 'text',
|
||||
id: 1043,
|
||||
tests: [
|
||||
['bang', 'bang']
|
||||
]
|
||||
}
|
||||
|
||||
exports['integer/int4'] = {
|
||||
format: 'text',
|
||||
id: 23,
|
||||
tests: [
|
||||
['2147483647', 2147483647]
|
||||
]
|
||||
}
|
||||
|
||||
exports['smallint/int2'] = {
|
||||
format: 'text',
|
||||
id: 21,
|
||||
tests: [
|
||||
['32767', 32767]
|
||||
]
|
||||
}
|
||||
|
||||
exports['bigint/int8'] = {
|
||||
format: 'text',
|
||||
id: 20,
|
||||
tests: [
|
||||
['9223372036854775807', '9223372036854775807']
|
||||
]
|
||||
}
|
||||
|
||||
exports.oid = {
|
||||
format: 'text',
|
||||
id: 26,
|
||||
tests: [
|
||||
['103', 103]
|
||||
]
|
||||
}
|
||||
|
||||
var bignum = '31415926535897932384626433832795028841971693993751058.16180339887498948482045868343656381177203091798057628'
|
||||
exports.numeric = {
|
||||
format: 'text',
|
||||
id: 1700,
|
||||
tests: [
|
||||
[bignum, bignum]
|
||||
]
|
||||
}
|
||||
|
||||
exports['real/float4'] = {
|
||||
format: 'text',
|
||||
id: 700,
|
||||
tests: [
|
||||
['123.456', 123.456]
|
||||
]
|
||||
}
|
||||
|
||||
exports['double precision / float 8'] = {
|
||||
format: 'text',
|
||||
id: 701,
|
||||
tests: [
|
||||
['12345678.12345678', 12345678.12345678]
|
||||
]
|
||||
}
|
||||
|
||||
exports.boolean = {
|
||||
format: 'text',
|
||||
id: 16,
|
||||
tests: [
|
||||
['TRUE', true],
|
||||
['t', true],
|
||||
['true', true],
|
||||
['y', true],
|
||||
['yes', true],
|
||||
['on', true],
|
||||
['1', true],
|
||||
['f', false],
|
||||
[null, null]
|
||||
]
|
||||
}
|
||||
|
||||
exports.timestamptz = {
|
||||
format: 'text',
|
||||
id: 1184,
|
||||
tests: [
|
||||
[
|
||||
'2010-10-31 14:54:13.74-05:30',
|
||||
dateEquals(2010, 9, 31, 20, 24, 13, 740)
|
||||
],
|
||||
[
|
||||
'2011-01-23 22:05:00.68-06',
|
||||
dateEquals(2011, 0, 24, 4, 5, 0, 680)
|
||||
],
|
||||
[
|
||||
'2010-10-30 14:11:12.730838Z',
|
||||
dateEquals(2010, 9, 30, 14, 11, 12, 730)
|
||||
],
|
||||
[
|
||||
'2010-10-30 13:10:01+05',
|
||||
dateEquals(2010, 9, 30, 8, 10, 1, 0)
|
||||
]
|
||||
]
|
||||
}
|
||||
|
||||
exports.timestamp = {
|
||||
format: 'text',
|
||||
id: 1114,
|
||||
tests: [
|
||||
[
|
||||
'2010-10-31 00:00:00',
|
||||
function (t, value) {
|
||||
t.equal(
|
||||
value.toUTCString(),
|
||||
new Date(2010, 9, 31, 0, 0, 0, 0, 0).toUTCString()
|
||||
)
|
||||
t.equal(
|
||||
value.toString(),
|
||||
new Date(2010, 9, 31, 0, 0, 0, 0, 0, 0).toString()
|
||||
)
|
||||
}
|
||||
]
|
||||
]
|
||||
}
|
||||
|
||||
exports.date = {
|
||||
format: 'text',
|
||||
id: 1082,
|
||||
tests: [
|
||||
['2010-10-31', function (t, value) {
|
||||
var now = new Date(2010, 9, 31)
|
||||
dateEquals(
|
||||
2010,
|
||||
now.getUTCMonth(),
|
||||
now.getUTCDate(),
|
||||
now.getUTCHours(), 0, 0, 0)(t, value)
|
||||
t.equal(value.getHours(), now.getHours())
|
||||
}]
|
||||
]
|
||||
}
|
||||
|
||||
exports.inet = {
|
||||
format: 'text',
|
||||
id: 869,
|
||||
tests: [
|
||||
['8.8.8.8', '8.8.8.8'],
|
||||
['2001:4860:4860::8888', '2001:4860:4860::8888'],
|
||||
['127.0.0.1', '127.0.0.1'],
|
||||
['fd00:1::40e', 'fd00:1::40e'],
|
||||
['1.2.3.4', '1.2.3.4']
|
||||
]
|
||||
}
|
||||
|
||||
exports.cidr = {
|
||||
format: 'text',
|
||||
id: 650,
|
||||
tests: [
|
||||
['172.16.0.0/12', '172.16.0.0/12'],
|
||||
['fe80::/10', 'fe80::/10'],
|
||||
['fc00::/7', 'fc00::/7'],
|
||||
['192.168.0.0/24', '192.168.0.0/24'],
|
||||
['10.0.0.0/8', '10.0.0.0/8']
|
||||
]
|
||||
}
|
||||
|
||||
exports.macaddr = {
|
||||
format: 'text',
|
||||
id: 829,
|
||||
tests: [
|
||||
['08:00:2b:01:02:03', '08:00:2b:01:02:03'],
|
||||
['16:10:9f:0d:66:00', '16:10:9f:0d:66:00']
|
||||
]
|
||||
}
|
||||
|
||||
exports.numrange = {
|
||||
format: 'text',
|
||||
id: 3906,
|
||||
tests: [
|
||||
['[,]', '[,]'],
|
||||
['(,)', '(,)'],
|
||||
['(,]', '(,]'],
|
||||
['[1,)', '[1,)'],
|
||||
['[,1]', '[,1]'],
|
||||
['(1,2)', '(1,2)'],
|
||||
['(1,20.5]', '(1,20.5]']
|
||||
]
|
||||
}
|
||||
|
||||
exports.interval = {
|
||||
format: 'text',
|
||||
id: 1186,
|
||||
tests: [
|
||||
  ['01:02:03', function (t, value) {
    t.equal(value.toPostgres(), '3 seconds 2 minutes 1 hours')
    t.deepEqual(value, {hours: 1, minutes: 2, seconds: 3})
  }],
  ['01:02:03.456', function (t, value) {
    t.deepEqual(value, {hours: 1, minutes: 2, seconds: 3, milliseconds: 456})
  }],
  ['1 year -32 days', function (t, value) {
    t.equal(value.toPostgres(), '-32 days 1 years')
    t.deepEqual(value, {years: 1, days: -32})
  }],
  ['1 day -00:00:03', function (t, value) {
    t.equal(value.toPostgres(), '-3 seconds 1 days')
    t.deepEqual(value, {days: 1, seconds: -3})
  }]
  ]
}

exports.bytea = {
  format: 'text',
  id: 17,
  tests: [
    ['foo\\000\\200\\\\\\377', function (t, value) {
      var buffer = new Buffer([102, 111, 111, 0, 128, 92, 255])
      t.ok(buffer.equals(value))
    }],
    ['', function (t, value) {
      var buffer = new Buffer(0)
      t.ok(buffer.equals(value))
    }]
  ]
}

exports['array/boolean'] = {
  format: 'text',
  id: 1000,
  tests: [
    ['{true,false}', function (t, value) {
      t.deepEqual(value, [true, false])
    }]
  ]
}

exports['array/char'] = {
  format: 'text',
  id: 1014,
  tests: [
    ['{foo,bar}', function (t, value) {
      t.deepEqual(value, ['foo', 'bar'])
    }]
  ]
}

exports['array/varchar'] = {
  format: 'text',
  id: 1015,
  tests: [
    ['{foo,bar}', function (t, value) {
      t.deepEqual(value, ['foo', 'bar'])
    }]
  ]
}

exports['array/text'] = {
  format: 'text',
  id: 1008,
  tests: [
    ['{foo}', function (t, value) {
      t.deepEqual(value, ['foo'])
    }]
  ]
}

exports['array/bytea'] = {
  format: 'text',
  id: 1001,
  tests: [
    ['{"\\\\x00000000"}', function (t, value) {
      var buffer = new Buffer('00000000', 'hex')
      t.ok(Array.isArray(value))
      t.equal(value.length, 1)
      t.ok(buffer.equals(value[0]))
    }],
    ['{NULL,"\\\\x4e554c4c"}', function (t, value) {
      var buffer = new Buffer('4e554c4c', 'hex')
      t.ok(Array.isArray(value))
      t.equal(value.length, 2)
      t.equal(value[0], null)
      t.ok(buffer.equals(value[1]))
    }]
  ]
}

exports['array/numeric'] = {
  format: 'text',
  id: 1231,
  tests: [
    ['{1.2,3.4}', function (t, value) {
      t.deepEqual(value, [1.2, 3.4])
    }]
  ]
}

exports['array/int2'] = {
  format: 'text',
  id: 1005,
  tests: [
    ['{-32768, -32767, 32766, 32767}', function (t, value) {
      t.deepEqual(value, [-32768, -32767, 32766, 32767])
    }]
  ]
}

exports['array/int4'] = {
  format: 'text',
  id: 1007,
  tests: [
    ['{-2147483648, -2147483647, 2147483646, 2147483647}', function (t, value) {
      t.deepEqual(value, [-2147483648, -2147483647, 2147483646, 2147483647])
    }]
  ]
}

exports['array/int8'] = {
  format: 'text',
  id: 1016,
  tests: [
    [
      '{-9223372036854775808, -9223372036854775807, 9223372036854775806, 9223372036854775807}',
      function (t, value) {
        t.deepEqual(value, [
          '-9223372036854775808',
          '-9223372036854775807',
          '9223372036854775806',
          '9223372036854775807'
        ])
      }
    ]
  ]
}

exports['array/json'] = {
  format: 'text',
  id: 199,
  tests: [
    [
      '{{1,2},{[3],"[4,5]"},{null,NULL}}',
      function (t, value) {
        t.deepEqual(value, [
          [1, 2],
          [[3], [4, 5]],
          [null, null]
        ])
      }
    ]
  ]
}

exports['array/jsonb'] = {
  format: 'text',
  id: 3807,
  tests: exports['array/json'].tests
}

exports['array/point'] = {
  format: 'text',
  id: 1017,
  tests: [
    ['{"(25.1,50.5)","(10.1,40)"}', function (t, value) {
      t.deepEqual(value, [{x: 25.1, y: 50.5}, {x: 10.1, y: 40}])
    }]
  ]
}

exports['array/oid'] = {
  format: 'text',
  id: 1028,
  tests: [
    ['{25864,25860}', function (t, value) {
      t.deepEqual(value, [25864, 25860])
    }]
  ]
}

exports['array/float4'] = {
  format: 'text',
  id: 1021,
  tests: [
    ['{1.2, 3.4}', function (t, value) {
      t.deepEqual(value, [1.2, 3.4])
    }]
  ]
}

exports['array/float8'] = {
  format: 'text',
  id: 1022,
  tests: [
    ['{-12345678.1234567, 12345678.12345678}', function (t, value) {
      t.deepEqual(value, [-12345678.1234567, 12345678.12345678])
    }]
  ]
}

exports['array/date'] = {
  format: 'text',
  id: 1182,
  tests: [
    ['{2014-01-01,2015-12-31}', function (t, value) {
      var expecteds = [new Date(2014, 0, 1), new Date(2015, 11, 31)]
      t.equal(value.length, 2)
      value.forEach(function (date, index) {
        var expected = expecteds[index]
        dateEquals(
          expected.getUTCFullYear(),
          expected.getUTCMonth(),
          expected.getUTCDate(),
          expected.getUTCHours(), 0, 0, 0)(t, date)
      })
    }]
  ]
}

exports['array/interval'] = {
  format: 'text',
  id: 1187,
  tests: [
    ['{01:02:03,1 day -00:00:03}', function (t, value) {
      var expecteds = [{hours: 1, minutes: 2, seconds: 3},
        {days: 1, seconds: -3}]
      t.equal(value.length, 2)
      t.deepEqual(value, expecteds)
    }]
  ]
}

exports['array/inet'] = {
  format: 'text',
  id: 1041,
  tests: [
    ['{8.8.8.8}', function (t, value) {
      t.deepEqual(value, ['8.8.8.8'])
    }],
    ['{2001:4860:4860::8888}', function (t, value) {
      t.deepEqual(value, ['2001:4860:4860::8888'])
    }],
    ['{127.0.0.1,fd00:1::40e,1.2.3.4}', function (t, value) {
      t.deepEqual(value, ['127.0.0.1', 'fd00:1::40e', '1.2.3.4'])
    }]
  ]
}

exports['array/cidr'] = {
  format: 'text',
  id: 651,
  tests: [
    ['{172.16.0.0/12}', function (t, value) {
      t.deepEqual(value, ['172.16.0.0/12'])
    }],
    ['{fe80::/10}', function (t, value) {
      t.deepEqual(value, ['fe80::/10'])
    }],
    ['{10.0.0.0/8,fc00::/7,192.168.0.0/24}', function (t, value) {
      t.deepEqual(value, ['10.0.0.0/8', 'fc00::/7', '192.168.0.0/24'])
    }]
  ]
}

exports['array/macaddr'] = {
  format: 'text',
  id: 1040,
  tests: [
    ['{08:00:2b:01:02:03,16:10:9f:0d:66:00}', function (t, value) {
      t.deepEqual(value, ['08:00:2b:01:02:03', '16:10:9f:0d:66:00'])
    }]
  ]
}

exports['array/numrange'] = {
  format: 'text',
  id: 3907,
  tests: [
    ['{"[1,2]","(4.5,8)","[10,40)","(-21.2,60.3]"}', function (t, value) {
      t.deepEqual(value, ['[1,2]', '(4.5,8)', '[10,40)', '(-21.2,60.3]'])
    }],
    ['{"[,20]","[3,]","[,]","(,35)","(1,)","(,)"}', function (t, value) {
      t.deepEqual(value, ['[,20]', '[3,]', '[,]', '(,35)', '(1,)', '(,)'])
    }],
    ['{"[,20)","[3,)","[,)","[,35)","[1,)","[,)"}', function (t, value) {
      t.deepEqual(value, ['[,20)', '[3,)', '[,)', '[,35)', '[1,)', '[,)'])
    }]
  ]
}

exports['binary-string/varchar'] = {
  format: 'binary',
  id: 1043,
  tests: [
    ['bang', 'bang']
  ]
}

exports['binary-integer/int4'] = {
  format: 'binary',
  id: 23,
  tests: [
    [[0, 0, 0, 100], 100]
  ]
}

exports['binary-smallint/int2'] = {
  format: 'binary',
  id: 21,
  tests: [
    [[0, 101], 101]
  ]
}

exports['binary-bigint/int8'] = {
  format: 'binary',
  id: 20,
  tests: [
    [new Buffer([0x7f, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff]), '9223372036854775807']
  ]
}

exports['binary-oid'] = {
  format: 'binary',
  id: 26,
  tests: [
    [[0, 0, 0, 103], 103]
  ]
}

exports['binary-numeric'] = {
  format: 'binary',
  id: 1700,
  tests: [
    [
      [0, 2, 0, 0, 0, 0, 0, hex('0x64'), 0, 12, hex('0xd'), hex('0x48'), 0, 0, 0, 0],
      12.34
    ]
  ]
}

exports['binary-real/float4'] = {
  format: 'binary',
  id: 700,
  tests: [
    [['0x41', '0x48', '0x00', '0x00'].map(hex), 12.5]
  ]
}

exports['binary-boolean'] = {
  format: 'binary',
  id: 16,
  tests: [
    [[1], true],
    [[0], false],
    [null, null]
  ]
}

exports['binary-string'] = {
  format: 'binary',
  id: 25,
  tests: [
    [
      new Buffer(['0x73', '0x6c', '0x61', '0x64', '0x64', '0x61'].map(hex)),
      'sladda'
    ]
  ]
}

exports.point = {
  format: 'text',
  id: 600,
  tests: [
    ['(25.1,50.5)', function (t, value) {
      t.deepEqual(value, {x: 25.1, y: 50.5})
    }]
  ]
}

exports.circle = {
  format: 'text',
  id: 718,
  tests: [
    ['<(25,10),5>', function (t, value) {
      t.deepEqual(value, {x: 25, y: 10, radius: 5})
    }]
  ]
}

function hex (string) {
  return parseInt(string, 16)
}

function dateEquals () {
  var timestamp = Date.UTC.apply(Date, arguments)
  return function (t, value) {
    t.equal(value.toUTCString(), new Date(timestamp).toUTCString())
  }
}
4
node_modules/pg/node_modules/postgres-array/index.d.ts
generated
vendored
Normal file
@ -0,0 +1,4 @@
export function parse(source: string): string[];
export function parse<T>(source: string, transform: (value: string) => T): T[];
97
node_modules/pg/node_modules/postgres-array/index.js
generated
vendored
Normal file
@ -0,0 +1,97 @@
'use strict'

exports.parse = function (source, transform) {
  return new ArrayParser(source, transform).parse()
}

class ArrayParser {
  constructor (source, transform) {
    this.source = source
    this.transform = transform || identity
    this.position = 0
    this.entries = []
    this.recorded = []
    this.dimension = 0
  }

  isEof () {
    return this.position >= this.source.length
  }

  nextCharacter () {
    var character = this.source[this.position++]
    if (character === '\\') {
      return {
        value: this.source[this.position++],
        escaped: true
      }
    }
    return {
      value: character,
      escaped: false
    }
  }

  record (character) {
    this.recorded.push(character)
  }

  newEntry (includeEmpty) {
    var entry
    if (this.recorded.length > 0 || includeEmpty) {
      entry = this.recorded.join('')
      if (entry === 'NULL' && !includeEmpty) {
        entry = null
      }
      if (entry !== null) entry = this.transform(entry)
      this.entries.push(entry)
      this.recorded = []
    }
  }

  consumeDimensions () {
    if (this.source[0] === '[') {
      while (!this.isEof()) {
        var char = this.nextCharacter()
        if (char.value === '=') break
      }
    }
  }

  parse (nested) {
    var character, parser, quote
    this.consumeDimensions()
    while (!this.isEof()) {
      character = this.nextCharacter()
      if (character.value === '{' && !quote) {
        this.dimension++
        if (this.dimension > 1) {
          parser = new ArrayParser(this.source.substr(this.position - 1), this.transform)
          this.entries.push(parser.parse(true))
          this.position += parser.position - 2
        }
      } else if (character.value === '}' && !quote) {
        this.dimension--
        if (!this.dimension) {
          this.newEntry()
          if (nested) return this.entries
        }
      } else if (character.value === '"' && !character.escaped) {
        if (quote) this.newEntry(true)
        quote = !quote
      } else if (character.value === ',' && !quote) {
        this.newEntry()
      } else {
        this.record(character.value)
      }
    }
    if (this.dimension !== 0) {
      throw new Error('array dimension not balanced')
    }
    return this.entries
  }
}

function identity (value) {
  return value
}
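The quote/NULL handling in `newEntry` above is the subtle part: an unquoted `NULL` becomes `null`, a quoted `"NULL"` stays a string, and `transform` is applied only to non-null entries. A minimal standalone sketch of that pass for flat, unquoted literals (`parseFlat` is a hypothetical name, not part of postgres-array's API):

```js
// Simplified sketch of the entry pass for a flat, unquoted array literal.
// The real ArrayParser also handles quoting, escapes, and nested dimensions.
function parseFlat (source, transform) {
  var body = source.slice(1, -1) // drop the surrounding '{' and '}'
  if (body === '') return []
  return body.split(',').map(function (entry) {
    // unquoted NULL is a SQL null; everything else goes through transform
    return entry === 'NULL' ? null : transform(entry)
  })
}

console.log(parseFlat('{1,2,NULL,3}', Number))
//=> [ 1, 2, null, 3 ]
```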
21
node_modules/pg/node_modules/postgres-array/license
generated
vendored
Normal file
@ -0,0 +1,21 @@
The MIT License (MIT)

Copyright (c) Ben Drucker <bvdrucker@gmail.com> (bendrucker.me)

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
35
node_modules/pg/node_modules/postgres-array/package.json
generated
vendored
Normal file
@ -0,0 +1,35 @@
{
  "name": "postgres-array",
  "main": "index.js",
  "version": "2.0.0",
  "description": "Parse postgres array columns",
  "license": "MIT",
  "repository": "bendrucker/postgres-array",
  "author": {
    "name": "Ben Drucker",
    "email": "bvdrucker@gmail.com",
    "url": "bendrucker.me"
  },
  "engines": {
    "node": ">=4"
  },
  "scripts": {
    "test": "standard && tape test.js"
  },
  "types": "index.d.ts",
  "keywords": [
    "postgres",
    "array",
    "parser"
  ],
  "dependencies": {},
  "devDependencies": {
    "standard": "^12.0.1",
    "tape": "^4.0.0"
  },
  "files": [
    "index.js",
    "index.d.ts",
    "readme.md"
  ]
}
43
node_modules/pg/node_modules/postgres-array/readme.md
generated
vendored
Normal file
@ -0,0 +1,43 @@
# postgres-array [](https://travis-ci.org/bendrucker/postgres-array)

> Parse postgres array columns

## Install

```
$ npm install --save postgres-array
```

## Usage

```js
var postgresArray = require('postgres-array')

postgresArray.parse('{1,2,3}', (value) => parseInt(value, 10))
//=> [1, 2, 3]
```

## API

#### `parse(input, [transform])` -> `array`

##### input

*Required*
Type: `string`

A Postgres array string.

##### transform

Type: `function`
Default: `identity`

A function that transforms non-null values inserted into the array.

## License

MIT © [Ben Drucker](http://bendrucker.me)
31
node_modules/pg/node_modules/postgres-bytea/index.js
generated
vendored
Normal file
@ -0,0 +1,31 @@
'use strict'

module.exports = function parseBytea (input) {
  if (/^\\x/.test(input)) {
    // new 'hex' style response (pg >9.0)
    return new Buffer(input.substr(2), 'hex')
  }
  var output = ''
  var i = 0
  while (i < input.length) {
    if (input[i] !== '\\') {
      output += input[i]
      ++i
    } else {
      if (/[0-7]{3}/.test(input.substr(i + 1, 3))) {
        output += String.fromCharCode(parseInt(input.substr(i + 1, 3), 8))
        i += 4
      } else {
        var backslashes = 1
        while (i + backslashes < input.length && input[i + backslashes] === '\\') {
          backslashes++
        }
        for (var k = 0; k < Math.floor(backslashes / 2); ++k) {
          output += '\\'
        }
        i += Math.floor(backslashes / 2) * 2
      }
    }
  }
  return new Buffer(output, 'binary')
}
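The escape branch above decodes the legacy bytea text format, where `\nnn` is three octal digits encoding one byte and a doubled backslash encodes a literal backslash. The octal step in isolation (a sketch, not the module's API):

```js
// One \nnn escape encodes a single byte, given as three octal digits.
function decodeOctalEscape (digits) {
  return String.fromCharCode(parseInt(digits, 8))
}

console.log(decodeOctalEscape('101')) // octal 101 = decimal 65 = 'A'
console.log(decodeOctalEscape('377').charCodeAt(0)) // octal 377 = 255
```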
21
node_modules/pg/node_modules/postgres-bytea/license
generated
vendored
Normal file
@ -0,0 +1,21 @@
The MIT License (MIT)

Copyright (c) Ben Drucker <bvdrucker@gmail.com> (bendrucker.me)

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
34
node_modules/pg/node_modules/postgres-bytea/package.json
generated
vendored
Normal file
@ -0,0 +1,34 @@
{
  "name": "postgres-bytea",
  "main": "index.js",
  "version": "1.0.0",
  "description": "Postgres bytea parser",
  "license": "MIT",
  "repository": "bendrucker/postgres-bytea",
  "author": {
    "name": "Ben Drucker",
    "email": "bvdrucker@gmail.com",
    "url": "bendrucker.me"
  },
  "engines": {
    "node": ">=0.10.0"
  },
  "scripts": {
    "test": "standard && tape test.js"
  },
  "keywords": [
    "bytea",
    "postgres",
    "binary",
    "parser"
  ],
  "dependencies": {},
  "devDependencies": {
    "tape": "^4.0.0",
    "standard": "^4.0.0"
  },
  "files": [
    "index.js",
    "readme.md"
  ]
}
34
node_modules/pg/node_modules/postgres-bytea/readme.md
generated
vendored
Normal file
@ -0,0 +1,34 @@
# postgres-bytea [](https://travis-ci.org/bendrucker/postgres-bytea)

> Postgres bytea parser

## Install

```
$ npm install --save postgres-bytea
```

## Usage

```js
var bytea = require('postgres-bytea')
bytea('\\000\\100\\200')
//=> buffer
```

## API

#### `bytea(input)` -> `buffer`

##### input

*Required*
Type: `string`

A Postgres bytea binary string.

## License

MIT © [Ben Drucker](http://bendrucker.me)
116
node_modules/pg/node_modules/postgres-date/index.js
generated
vendored
Normal file
@ -0,0 +1,116 @@
'use strict'

var DATE_TIME = /(\d{1,})-(\d{2})-(\d{2}) (\d{2}):(\d{2}):(\d{2})(\.\d{1,})?.*?( BC)?$/
var DATE = /^(\d{1,})-(\d{2})-(\d{2})( BC)?$/
var TIME_ZONE = /([Z+-])(\d{2})?:?(\d{2})?:?(\d{2})?/
var INFINITY = /^-?infinity$/

module.exports = function parseDate (isoDate) {
  if (INFINITY.test(isoDate)) {
    // Capitalize to Infinity before passing to Number
    return Number(isoDate.replace('i', 'I'))
  }
  var matches = DATE_TIME.exec(isoDate)

  if (!matches) {
    // Force YYYY-MM-DD dates to be parsed as local time
    return getDate(isoDate) || null
  }

  var isBC = !!matches[8]
  var year = parseInt(matches[1], 10)
  if (isBC) {
    year = bcYearToNegativeYear(year)
  }

  var month = parseInt(matches[2], 10) - 1
  var day = matches[3]
  var hour = parseInt(matches[4], 10)
  var minute = parseInt(matches[5], 10)
  var second = parseInt(matches[6], 10)

  var ms = matches[7]
  ms = ms ? 1000 * parseFloat(ms) : 0

  var date
  var offset = timeZoneOffset(isoDate)
  if (offset != null) {
    date = new Date(Date.UTC(year, month, day, hour, minute, second, ms))

    // Account for years from 0 to 99 being interpreted as 1900-1999
    // by Date.UTC / the multi-argument form of the Date constructor
    if (is0To99(year)) {
      date.setUTCFullYear(year)
    }

    if (offset !== 0) {
      date.setTime(date.getTime() - offset)
    }
  } else {
    date = new Date(year, month, day, hour, minute, second, ms)

    if (is0To99(year)) {
      date.setFullYear(year)
    }
  }

  return date
}

function getDate (isoDate) {
  var matches = DATE.exec(isoDate)
  if (!matches) {
    return
  }

  var year = parseInt(matches[1], 10)
  var isBC = !!matches[4]
  if (isBC) {
    year = bcYearToNegativeYear(year)
  }

  var month = parseInt(matches[2], 10) - 1
  var day = matches[3]
  // YYYY-MM-DD will be parsed as local time
  var date = new Date(year, month, day)

  if (is0To99(year)) {
    date.setFullYear(year)
  }

  return date
}

// match timezones:
// Z (UTC)
// -05
// +06:30
function timeZoneOffset (isoDate) {
  if (isoDate.endsWith('+00')) {
    return 0
  }

  var zone = TIME_ZONE.exec(isoDate.split(' ')[1])
  if (!zone) return
  var type = zone[1]

  if (type === 'Z') {
    return 0
  }
  var sign = type === '-' ? -1 : 1
  var offset = parseInt(zone[2], 10) * 3600 +
    parseInt(zone[3] || 0, 10) * 60 +
    parseInt(zone[4] || 0, 10)

  return offset * sign * 1000
}

function bcYearToNegativeYear (year) {
  // Account for numerical difference between representations of BC years
  // See: https://github.com/bendrucker/postgres-date/issues/5
  return -(year - 1)
}

function is0To99 (num) {
  return num >= 0 && num < 100
}
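`timeZoneOffset` above converts an offset such as `+06:30` into signed milliseconds, which are then subtracted from the UTC timestamp. The arithmetic on its own (a sketch mirroring the function's hours/minutes/seconds terms; `offsetMs` is an illustrative name, not part of the module):

```js
// '+06:30' → hours and minutes to a signed millisecond offset,
// matching the hours*3600 + minutes*60 + seconds sum used above.
function offsetMs (sign, hours, minutes, seconds) {
  return (hours * 3600 + minutes * 60 + seconds) * (sign === '-' ? -1 : 1) * 1000
}

console.log(offsetMs('+', 6, 30, 0)) // 23400000
console.log(offsetMs('-', 5, 0, 0)) // -18000000
```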
21
node_modules/pg/node_modules/postgres-date/license
generated
vendored
Normal file
@ -0,0 +1,21 @@
The MIT License (MIT)

Copyright (c) Ben Drucker <bvdrucker@gmail.com> (bendrucker.me)

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
33
node_modules/pg/node_modules/postgres-date/package.json
generated
vendored
Normal file
@ -0,0 +1,33 @@
{
  "name": "postgres-date",
  "main": "index.js",
  "version": "1.0.7",
  "description": "Postgres date column parser",
  "license": "MIT",
  "repository": "bendrucker/postgres-date",
  "author": {
    "name": "Ben Drucker",
    "email": "bvdrucker@gmail.com",
    "url": "bendrucker.me"
  },
  "engines": {
    "node": ">=0.10.0"
  },
  "scripts": {
    "test": "standard && tape test.js"
  },
  "keywords": [
    "postgres",
    "date",
    "parser"
  ],
  "dependencies": {},
  "devDependencies": {
    "standard": "^14.0.0",
    "tape": "^5.0.0"
  },
  "files": [
    "index.js",
    "readme.md"
  ]
}
49
node_modules/pg/node_modules/postgres-date/readme.md
generated
vendored
Normal file
@ -0,0 +1,49 @@
# postgres-date [](https://travis-ci.org/bendrucker/postgres-date) [](https://greenkeeper.io/)

> Postgres date output parser

This package parses [date/time outputs](https://www.postgresql.org/docs/current/datatype-datetime.html#DATATYPE-DATETIME-OUTPUT) from Postgres into Javascript `Date` objects. Its goal is to match Postgres behavior and preserve data accuracy.

If you find a case where a valid Postgres output results in incorrect parsing (including loss of precision), please [create a pull request](https://github.com/bendrucker/postgres-date/compare) and provide a failing test.

**Supported Postgres Versions:** `>= 9.6`

All prior versions of Postgres are likely compatible but not officially supported.

## Install

```
$ npm install --save postgres-date
```

## Usage

```js
var parse = require('postgres-date')
parse('2011-01-23 22:15:51Z')
// => 2011-01-23T22:15:51.000Z
```

## API

#### `parse(isoDate)` -> `date`

##### isoDate

*Required*
Type: `string`

A date string from Postgres.

## Releases

The following semantic versioning increments will be used for changes:

* **Major**: Removal of support for Node.js versions or Postgres versions (not expected)
* **Minor**: Unused, since Postgres returns dates in standard ISO 8601 format
* **Patch**: Any fix for parsing behavior

## License

MIT © [Ben Drucker](http://bendrucker.me)
20
node_modules/pg/node_modules/postgres-interval/index.d.ts
generated
vendored
Normal file
@ -0,0 +1,20 @@
declare namespace PostgresInterval {
  export interface IPostgresInterval {
    years?: number;
    months?: number;
    days?: number;
    hours?: number;
    minutes?: number;
    seconds?: number;
    milliseconds?: number;

    toPostgres(): string;

    toISO(): string;
    toISOString(): string;
  }
}

declare function PostgresInterval(raw: string): PostgresInterval.IPostgresInterval;

export = PostgresInterval;
125
node_modules/pg/node_modules/postgres-interval/index.js
generated
vendored
Normal file
@ -0,0 +1,125 @@
'use strict'

var extend = require('xtend/mutable')

module.exports = PostgresInterval

function PostgresInterval (raw) {
  if (!(this instanceof PostgresInterval)) {
    return new PostgresInterval(raw)
  }
  extend(this, parse(raw))
}
var properties = ['seconds', 'minutes', 'hours', 'days', 'months', 'years']
PostgresInterval.prototype.toPostgres = function () {
  var filtered = properties.filter(this.hasOwnProperty, this)

  // In addition to `properties`, we need to account for fractions of seconds.
  if (this.milliseconds && filtered.indexOf('seconds') < 0) {
    filtered.push('seconds')
  }

  if (filtered.length === 0) return '0'
  return filtered
    .map(function (property) {
      var value = this[property] || 0

      // Account for fractional part of seconds,
      // remove trailing zeroes.
      if (property === 'seconds' && this.milliseconds) {
        value = (value + this.milliseconds / 1000).toFixed(6).replace(/\.?0+$/, '')
      }

      return value + ' ' + property
    }, this)
    .join(' ')
}

var propertiesISOEquivalent = {
  years: 'Y',
  months: 'M',
  days: 'D',
  hours: 'H',
  minutes: 'M',
  seconds: 'S'
}
var dateProperties = ['years', 'months', 'days']
var timeProperties = ['hours', 'minutes', 'seconds']
// according to ISO 8601
PostgresInterval.prototype.toISOString = PostgresInterval.prototype.toISO = function () {
  var datePart = dateProperties
    .map(buildProperty, this)
    .join('')

  var timePart = timeProperties
    .map(buildProperty, this)
    .join('')

  return 'P' + datePart + 'T' + timePart

  function buildProperty (property) {
    var value = this[property] || 0

    // Account for fractional part of seconds,
    // remove trailing zeroes.
    if (property === 'seconds' && this.milliseconds) {
      value = (value + this.milliseconds / 1000).toFixed(6).replace(/0+$/, '')
    }

    return value + propertiesISOEquivalent[property]
  }
}

var NUMBER = '([+-]?\\d+)'
var YEAR = NUMBER + '\\s+years?'
var MONTH = NUMBER + '\\s+mons?'
var DAY = NUMBER + '\\s+days?'
var TIME = '([+-])?([\\d]*):(\\d\\d):(\\d\\d)\\.?(\\d{1,6})?'
var INTERVAL = new RegExp([YEAR, MONTH, DAY, TIME].map(function (regexString) {
  return '(' + regexString + ')?'
})
  .join('\\s*'))

// Positions of values in regex match
var positions = {
  years: 2,
  months: 4,
  days: 6,
  hours: 9,
  minutes: 10,
  seconds: 11,
  milliseconds: 12
}
// We can use negative time
var negatives = ['hours', 'minutes', 'seconds', 'milliseconds']

function parseMilliseconds (fraction) {
  // add omitted zeroes
  var microseconds = fraction + '000000'.slice(fraction.length)
  return parseInt(microseconds, 10) / 1000
}

function parse (interval) {
  if (!interval) return {}
  var matches = INTERVAL.exec(interval)
  var isNegative = matches[8] === '-'
  return Object.keys(positions)
    .reduce(function (parsed, property) {
      var position = positions[property]
      var value = matches[position]
      // no empty string
      if (!value) return parsed
      // milliseconds are actually microseconds (up to 6 digits)
      // with omitted trailing zeroes.
      value = property === 'milliseconds'
        ? parseMilliseconds(value)
        : parseInt(value, 10)
      // no zeros
      if (!value) return parsed
      if (isNegative && ~negatives.indexOf(property)) {
        value *= -1
      }
      parsed[property] = value
      return parsed
    }, {})
}
21
node_modules/pg/node_modules/postgres-interval/license
generated
vendored
Normal file
@ -0,0 +1,21 @@
The MIT License (MIT)

Copyright (c) Ben Drucker <bvdrucker@gmail.com> (bendrucker.me)

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
36
node_modules/pg/node_modules/postgres-interval/package.json
generated
vendored
Normal file
@ -0,0 +1,36 @@

{
  "name": "postgres-interval",
  "main": "index.js",
  "version": "1.2.0",
  "description": "Parse Postgres interval columns",
  "license": "MIT",
  "repository": "bendrucker/postgres-interval",
  "author": {
    "name": "Ben Drucker",
    "email": "bvdrucker@gmail.com",
    "url": "bendrucker.me"
  },
  "engines": {
    "node": ">=0.10.0"
  },
  "scripts": {
    "test": "standard && tape test.js"
  },
  "keywords": [
    "postgres",
    "interval",
    "parser"
  ],
  "dependencies": {
    "xtend": "^4.0.0"
  },
  "devDependencies": {
    "tape": "^4.0.0",
    "standard": "^12.0.1"
  },
  "files": [
    "index.js",
    "index.d.ts",
    "readme.md"
  ]
}
48
node_modules/pg/node_modules/postgres-interval/readme.md
generated
vendored
Normal file
@ -0,0 +1,48 @@

# postgres-interval [](https://travis-ci.org/bendrucker/postgres-interval) [](https://greenkeeper.io/)

> Parse Postgres interval columns

## Install

```
$ npm install --save postgres-interval
```

## Usage

```js
var parse = require('postgres-interval')
var interval = parse('01:02:03')
//=> {hours: 1, minutes: 2, seconds: 3}
interval.toPostgres()
// 3 seconds 2 minutes 1 hours
interval.toISO()
// P0Y0M0DT1H2M3S
```

## API

#### `parse(pgInterval)` -> `interval`

##### pgInterval

*Required*
Type: `string`

A Postgres interval string.

#### `interval.toPostgres()` -> `string`

Returns an interval string. This allows the interval object to be passed into prepared statements.

#### `interval.toISOString()` -> `string`

Returns an [ISO 8601](https://en.wikipedia.org/wiki/ISO_8601#Durations) compliant string.

Also available as `interval.toISO()` for backwards compatibility.

## License

MIT © [Ben Drucker](http://bendrucker.me)
62
node_modules/pg/package.json
generated
vendored
Normal file
@ -0,0 +1,62 @@

{
  "name": "pg",
  "version": "8.13.1",
  "description": "PostgreSQL client - pure javascript & libpq with the same API",
  "keywords": [
    "database",
    "libpq",
    "pg",
    "postgre",
    "postgres",
    "postgresql",
    "rdbms"
  ],
  "homepage": "https://github.com/brianc/node-postgres",
  "repository": {
    "type": "git",
    "url": "git://github.com/brianc/node-postgres.git",
    "directory": "packages/pg"
  },
  "author": "Brian Carlson <brian.m.carlson@gmail.com>",
  "main": "./lib",
  "dependencies": {
    "pg-connection-string": "^2.7.0",
    "pg-pool": "^3.7.0",
    "pg-protocol": "^1.7.0",
    "pg-types": "^2.1.0",
    "pgpass": "1.x"
  },
  "devDependencies": {
    "@cloudflare/workers-types": "^4.20230404.0",
    "async": "2.6.4",
    "bluebird": "3.7.2",
    "co": "4.6.0",
    "pg-copy-streams": "0.3.0",
    "typescript": "^4.0.3",
    "workerd": "^1.20230419.0",
    "wrangler": "3.58.0"
  },
  "optionalDependencies": {
    "pg-cloudflare": "^1.1.1"
  },
  "peerDependencies": {
    "pg-native": ">=3.0.1"
  },
  "peerDependenciesMeta": {
    "pg-native": {
      "optional": true
    }
  },
  "scripts": {
    "test": "make test-all"
  },
  "files": [
    "lib",
    "SPONSORS.md"
  ],
  "license": "MIT",
  "engines": {
    "node": ">= 8.0.0"
  },
  "gitHead": "95d7e620ef8b51743b4cbca05dd3c3ce858ecea7"
}