
Added files

This commit is contained in:
2015-06-25 16:28:41 -05:00
parent 656dca9289
commit eb27b55a54
5621 changed files with 1630154 additions and 0 deletions

20
node_modules/nodemailer/.jshintrc generated vendored Normal file
View File

@@ -0,0 +1,20 @@
{
"indent": 4,
"node": true,
"globalstrict": true,
"evil": true,
"unused": true,
"undef": true,
"newcap": true,
"esnext": true,
"curly": true,
"eqeqeq": true,
"expr": true,
"predef": [
"describe",
"it",
"beforeEach",
"afterEach"
]
}

4
node_modules/nodemailer/.npmignore generated vendored Normal file
View File

@@ -0,0 +1,4 @@
.travis.yml
assets
examples
test

215
node_modules/nodemailer/CHANGELOG.md generated vendored Normal file
View File

@@ -0,0 +1,215 @@
# Changelog
## v1.3.4 2015-04-25
Maintenance release, bumped buildmail version to get fixed format=flowed handling
## v1.3.3 2015-04-25
Maintenance release, bumped dependencies
## v1.3.2 2015-03-09
Maintenance release, upgraded dependencies. Replaced simplesmtp based tests with smtp-server based ones.
## v1.3.0 2014-09-12
Maintenance release, upgrades buildmail and libmime. Allows using functions as transform plugins and fixes issue with unicode filenames in Gmail.
## v1.2.2 2014-09-05
Proper handling of data uris as attachments. Attachment `path` property can also be defined as a data uri, not just regular url or file path.
## v1.2.1 2014-08-21
Bumped libmime and mailbuild versions to properly handle filenames with spaces (short ascii only filenames with spaces were left unquoted).
## v1.2.0 2014-08-18
Allow using encoded strings as attachments. Added new property `encoding` which defines the encoding used for a `content` string. If encoding is set, the content value is converted to a Buffer value using the defined encoding before usage. Useful for including binary attachments in JSON formatted email objects.
## v1.1.2 2014-08-18
Return deprecation error for v0.x style configuration
## v1.1.1 2014-07-30
Bumped nodemailer-direct-transport dependency. Updated version includes a bugfix for Stream nodes handling. Important only if use direct-transport with Streams (not file paths or urls) as attachment content.
## v1.1.0 2014-07-29
Added new method `resolveContent()` to get the html/text/attachment content as a String or Buffer.
## v1.0.4 2014-07-23
Bugfix release. HTML node was inserted twice if the message consisted of HTML
content (but no text content) + at least one attachment with CID + at least
one attachment without CID. In this case the HTML node was inserted both to
the root level multipart/mixed section and to the multipart/related sub section
## v1.0.3 2014-07-16
Fixed a bug where Nodemailer crashed if the message content type was multipart/related
## v1.0.2 2014-07-16
Upgraded nodemailer-smtp-transport to 0.1.11. The docs state that for SSL you should use 'secure' option but the underlying smtp-connection module used 'secureConnection' for this purpose. Fixed smtp-connection to match the docs.
## v1.0.1 2014-07-15
Implemented missing #close method that is passed to the underlying transport object. Required by the smtp pool.
## v1.0.0 2014-07-15
Total rewrite. See migration guide here: http://www.andrisreinman.com/nodemailer-v1-0/#migrationguide
## v0.7.1 2014-07-09
* Upgraded aws-sdk to 2.0.5
## v0.7.0 2014-06-17
* Bumped version to v0.7.0
* Fix AWS-SES usage [5b6bc144]
* Replace current SES with new SES using AWS-SDK (Elanorr) [c79d797a]
* Updated README.md about Node Email Templates (niftylettuce) [e52bef81]
## v0.6.5 2014-05-15
* Bumped version to v0.6.5
* Use tildes instead of carets for dependency listing [5296ce41]
* Allow clients to set a custom identityString (venables) [5373287d]
* bugfix (adding "-i" to sendmail command line for each new mail) by copying this.args (vrodic) [05a8a9a3]
* update copyright (gdi2290) [3a6cba3a]
## v0.6.4 2014-05-13
* Bumped version to v0.6.4
* added npmignore, bumped dependencies [21bddcd9]
* Add AOL to well-known services (msouce) [da7dd3b7]
## v0.6.3 2014-04-16
* Bumped version to v0.6.3
* Upgraded simplesmtp dependency [dd367f59]
## v0.6.2 2014-04-09
* Bumped version to v0.6.2
* Added error option to Stub transport [c423acad]
* Use SVG npm badge (t3chnoboy) [677117b7]
* add SendCloud to well known services (haio) [43c358e0]
* High-res build-passing and NPM module badges (sahat) [9fdc37cd]
## v0.6.1 2014-01-26
* Bumped version to v0.6.1
* Do not throw on multiple errors from sendmail command [c6e2cd12]
* Do not require callback for pickup, fixes #238 [93eb3214]
* Added AWSSecurityToken information to README, fixes #235 [58e921d1]
* Added Nodemailer logo [06b7d1a8]
## v0.6.0 2013-12-30
* Bumped version to v0.6.0
* Allow defining custom transport methods [ec5b48ce]
* Return messageId with responseObject for all built in transport methods [74445cec]
* Bumped dependency versions for mailcomposer and readable-stream [9a034c34]
* Changed pickup argument name to 'directory' [01c3ea53]
* Added support for IIS pickup directory with PICKUP transport (philipproplesch) [36940b59..360a2878]
* Applied common styles [9e93a409]
* Updated readme [c78075e7]
## v0.5.15 2013-12-13
* bumped version to v0.5.15
* Updated README, added global options info for setting up transports [554bb0e5]
* Resolve public hostname, if resolveHostname property for a transport object is set to `true` [9023a6e1..4c66b819]
## v0.5.14 2013-12-05
* bumped version to v0.5.14
* Expose status for direct messages [f0312df6]
* Allow to skip the X-Mailer header if xMailer value is set to 'false' [f2c20a68]
## v0.5.13 2013-12-03
* bumped version to v0.5.13
* Use the name property from the transport object to use for the domain part of message-id values (1598eee9)
## v0.5.12 2013-12-02
* bumped version to v0.5.12
* Expose transport method and transport module version if available [a495106e]
* Added 'he' module instead of using custom html entity decoding [c197d102]
* Added xMailer property for transport configuration object to override X-Mailer value [e8733a61]
* Updated README, added description for 'mail' method [e1f5f3a6]
## v0.5.11 2013-11-28
* bumped version to v0.5.11
* Updated mailcomposer version. Replaces ent with he [6a45b790e]
## v0.5.10 2013-11-26
* bumped version to v0.5.10
* added shorthand function mail() for direct transport type [88129bd7]
* minor tweaks and typo fixes [f797409e..ceac0ca4]
## v0.5.9 2013-11-25
* bumped version to v0.5.9
* Update for 'direct' handling [77b84e2f]
* do not require callback to be provided for 'direct' type [ec51c79f]
## v0.5.8 2013-11-22
* bumped version to v0.5.8
* Added support for 'direct' transport [826f226d..0dbbcbbc]
## v0.5.7 2013-11-18
* bumped version to v0.5.7
* Replace \r\n by \n in Sendmail transport (rolftimmermans) [fed2089e..616ec90c]
A lot of sendmail implementations choke on \r\n newlines and require \n
This commit addresses this by transforming all \r\n sequences passed to
the sendmail command with \n
## v0.5.6 2013-11-15
* bumped version to v0.5.6
* Upgraded mailcomposer dependency to 0.2.4 [e5ff9c40]
* Removed noCR option [e810d1b8]
* Update wellknown.js, added FastMail (k-j-kleist) [cf930f6d]
## v0.5.5 2013-10-30
* bumped version to v0.5.5
* Updated mailcomposer dependency version to 0.2.3
* Remove legacy code - node v0.4 is not supported anymore anyway
* Use hostname (autodetected or from the options.name property) for Message-Id instead of "Nodemailer" (helps a bit when messages are identified as spam)
* Added maxMessages info to README
## v0.5.4 2013-10-29
* bumped version to v0.5.4
* added "use strict" statements
* Added DSN info to README
* add support for QQ enterprise email (coderhaoxin)
* Add a Bitdeli Badge to README
* DSN options passed through into simplesmtp. (irvinzz)
## v0.5.3 2013-10-03
* bumped version v0.5.3
* Using a stub transport to prevent sendmail from being called during a test. (jsdevel)
* closes #78: sendmail transport does not work correctly on Unix machines. (jsdevel)
* Updated PaaS Support list to include Modulus. (fiveisprime)
* Translate self closing break tags to newline (kosmasgiannis)
* fix typos (aeosynth)
## v0.5.2 2013-07-25
* bumped version v0.5.2
* Merge pull request #177 from MrSwitch/master
Fixing Amazon SES, fatal error caused by bad connection

30
node_modules/nodemailer/Gruntfile.js generated vendored Normal file
View File

@@ -0,0 +1,30 @@
'use strict';
module.exports = function(grunt) {
// Project configuration.
grunt.initConfig({
jshint: {
all: ['src/*.js', 'test/*.js', 'examples/*.js', 'Gruntfile.js'],
options: {
jshintrc: '.jshintrc'
}
},
mochaTest: {
all: {
options: {
reporter: 'spec'
},
src: ['test/*-test.js']
}
}
});
// Load the plugin(s)
grunt.loadNpmTasks('grunt-contrib-jshint');
grunt.loadNpmTasks('grunt-mocha-test');
// Tasks
grunt.registerTask('default', ['jshint', 'mochaTest']);
};

16
node_modules/nodemailer/LICENSE generated vendored Normal file
View File

@@ -0,0 +1,16 @@
Copyright (c) 2011-2015 Andris Reinman
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

620
node_modules/nodemailer/README.md generated vendored Normal file
View File

@@ -0,0 +1,620 @@
![Nodemailer](https://raw.githubusercontent.com/andris9/Nodemailer/master/assets/nm_logo_200x136.png)
Send e-mails from Node.js easy as cake!
[![Gitter](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/andris9/Nodemailer?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
[![Build Status](https://secure.travis-ci.org/andris9/Nodemailer.svg)](http://travis-ci.org/andris9/Nodemailer)
<a href="http://badge.fury.io/js/nodemailer"><img src="https://badge.fury.io/js/nodemailer.svg" alt="NPM version" height="18"></a>
## Upgrade warning
Do not upgrade Nodemailer from 0.7 or lower to 1.0 as there are breaking changes. You can continue to use the 0.7 branch as long as you like. See the documentation for 0.7 [here](https://github.com/andris9/Nodemailer/blob/0.7/README.md).
### Migration guide
See the migration guide from 0.7 to 1.0 [in the 1.0 release blog post](http://www.andrisreinman.com/nodemailer-v1-0/#migrationguide).
## Notes and information
### Nodemailer supports
* **Unicode** to use any characters
* **Windows** you can install it with *npm* on Windows just like any other module, there are no compiled dependencies. Use it from Azure or from your Windows box hassle free.
* **HTML content** as well as **plain text** alternative
* **Attachments** (including attachment **streaming** for sending larger files)
* **Embedded images** in HTML
* Secure e-mail delivery using **SSL/STARTTLS**
* Different **transport methods**, either using built in transports or from external plugins
* Custom **Plugin support** for manipulating messages (add DKIM signatures, use markdown content instead of HTML etc.)
* Sane **XOAUTH2** login with automatic access token generation (and feedback about the updated tokens)
### Support Nodemailer development
[![Donate to author](https://www.paypalobjects.com/en_US/i/btn/btn_donate_SM.gif)](https://www.paypal.com/cgi-bin/webscr?cmd=_s-xclick&hosted_button_id=DB26KWR2BQX5W)
If you want to support with Bitcoins, then my wallet address is `15Z8ADxhssKUiwP3jbbqJwA21744KMCfTM`
## TL;DR Usage Example
This is a complete example to send an e-mail with plaintext and HTML body
```javascript
var nodemailer = require('nodemailer');
// create reusable transporter object using SMTP transport
var transporter = nodemailer.createTransport({
service: 'Gmail',
auth: {
user: 'gmail.user@gmail.com',
pass: 'userpass'
}
});
// NB! No need to recreate the transporter object. You can use
// the same transporter object for all e-mails
// setup e-mail data with unicode symbols
var mailOptions = {
from: 'Fred Foo ✔ <foo@blurdybloop.com>', // sender address
to: 'bar@blurdybloop.com, baz@blurdybloop.com', // list of receivers
subject: 'Hello ✔', // Subject line
text: 'Hello world ✔', // plaintext body
html: '<b>Hello world ✔</b>' // html body
};
// send mail with defined transport object
transporter.sendMail(mailOptions, function(error, info){
if(error){
console.log(error);
}else{
console.log('Message sent: ' + info.response);
}
});
```
See [nodemailer-smtp-transport](https://github.com/andris9/nodemailer-smtp-transport#usage) for SMTP configuration options and [nodemailer-wellknown](https://github.com/andris9/nodemailer-wellknown#supported-services) for preconfigured service names (example uses 'gmail').
> When using default SMTP transport, then you do not need to define transport type explicitly (even though you can), just provide the SMTP options and that's it. For anything else, see the docs of the particular [transport mechanism](#available-transports).
## Setting up
Install with npm
npm install nodemailer
To send e-mails you need a transporter object
```javascript
var transporter = nodemailer.createTransport(transport)
```
Where
* **transporter** is going to be an object that is able to send mail
* **transport** is a transport mechanism. If it is not set [nodemailer-direct-transport](https://github.com/andris9/nodemailer-direct-transport) transport is used. If it is a regular object [nodemailer-smtp-transport](https://github.com/andris9/nodemailer-smtp-transport) is used and the value is passed as SMTP configuration.
> You have to create the transporter object only once. If you already have a transporter object you can use it to send mail as much as you like.
### Examples
#### Use *direct* transport
In this case all e-mails are sent directly to the recipients' MX servers (using port 25)
```javascript
var nodemailer = require('nodemailer');
var transporter = nodemailer.createTransport();
transporter.sendMail({
from: 'sender@address',
to: 'receiver@address',
subject: 'hello',
text: 'hello world!'
});
```
> Using *direct* transport is not reliable, as the outgoing port 25 is often blocked by default. Additionally, mail sent from dynamic addresses is often flagged as spam. You should really consider using an SMTP provider.
#### Use the default *SMTP* transport
See SMTP [configuration options here](https://github.com/andris9/nodemailer-smtp-transport#usage)
```javascript
var nodemailer = require('nodemailer');
var transporter = nodemailer.createTransport({
service: 'gmail',
auth: {
user: 'sender@gmail.com',
pass: 'password'
}
});
transporter.sendMail({
from: 'sender@address',
to: 'receiver@address',
subject: 'hello',
text: 'hello world!'
});
```
> The default SMTP transport is not suitable for a large volume of e-mails, since a new SMTP connection is established for every message sent. Use [nodemailer-smtp-pool](https://github.com/andris9/nodemailer-smtp-pool) if you need to send a large amount of e-mails.
>
> For sending bulk mail using Nodemailer see the [recommendations below](#delivering-bulk-mail)
#### Use a transport plugin
See [Available Transports](#available-transports) for known transport plugins, but there might be unlisted plugins as well.
The following example uses [nodemailer-ses-transport](https://github.com/andris9/nodemailer-ses-transport) (Amazon SES).
```javascript
var nodemailer = require('nodemailer');
var ses = require('nodemailer-ses-transport');
var transporter = nodemailer.createTransport(ses({
accessKeyId: 'AWSACCESSKEY',
secretAccessKey: 'AWS/Secret/key'
}));
transporter.sendMail({
from: 'sender@address',
to: 'receiver@address',
subject: 'hello',
text: 'hello world!'
});
```
## Available Transports
**Built in**
* **[nodemailer-smtp-transport](https://github.com/andris9/nodemailer-smtp-transport)** for sending messages using an SMTP service
* **[nodemailer-direct-transport](https://github.com/andris9/nodemailer-direct-transport)** for sending messages directly to recipients' MX servers (zero configuration needed but unreliable)
**Install as dependencies**
* **[nodemailer-smtp-pool](https://github.com/andris9/nodemailer-smtp-pool)** for sending messages to SMTP using pooled connections
* **[nodemailer-ses-transport](https://github.com/andris9/nodemailer-ses-transport)** for sending messages to AWS SES
* **[nodemailer-sendmail-transport](https://github.com/andris9/nodemailer-sendmail-transport)** for piping messages to the *sendmail* command
* **[nodemailer-stub-transport](https://github.com/andris9/nodemailer-stub-transport)** is just for returning messages, most probably for testing purposes
* **[nodemailer-pickup-transport](https://github.com/andris9/nodemailer-pickup-transport)** for storing messages to pickup folders
* **[nodemailer-sendgrid-transport](https://github.com/sendgrid/nodemailer-sendgrid-transport)** for sending messages through SendGrid's Web API
* *add yours* (see transport api documentation [here](#transports))
## Available Plugins
* **[nodemailer-markdown](https://github.com/andris9/nodemailer-markdown)** to use markdown for the content
* **[nodemailer-dkim](https://github.com/andris9/nodemailer-dkim)** to sign messages with DKIM
* **[nodemailer-html-to-text](https://github.com/andris9/nodemailer-html-to-text)** to auto generate plaintext content from html
* **[nodemailer-express-handlebars](https://github.com/yads/nodemailer-express-handlebars)** to auto generate html emails from handlebars/mustache templates
* *add yours* (see plugin api documentation [here](#plugin-api))
## Sending mail
Once you have a transporter object you can send mail
```javascript
transporter.sendMail(data, callback)
```
Where
* **data** defines the mail content (see [e-mail message fields](#e-mail-message-fields) below)
* **callback** is an optional callback function to run once the message is delivered or sending fails.
* **err** is the error object if message failed
* **info** includes the result, the exact format depends on the transport mechanism used
* **info.messageId** most transports *should* return the final Message-Id value used with this property
* **info.envelope** includes the envelope object for the message
* **info.accepted** is an array returned by SMTP transports (includes recipient addresses that were accepted by the server)
* **info.rejected** is an array returned by SMTP transports (includes recipient addresses that were rejected by the server)
* **info.pending** is an array returned by Direct SMTP transport. Includes recipient addresses that were temporarily rejected together with the server response
* **response** is a string returned by SMTP transports and includes the last SMTP response from the server
> If the message includes several recipients then the message is considered sent if at least one recipient is accepted
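For illustration, here is a minimal sketch of inspecting these values with an SMTP transport. It reuses the `transporter` and `mailOptions` objects from the TL;DR example above; the exact fields available depend on the transport used.
```javascript
transporter.sendMail(mailOptions, function(error, info){
    if(error){
        // sending failed, the error object describes why
        return console.log(error);
    }
    // transport dependent result data, so guard against missing fields
    console.log('Message-Id: ' + info.messageId);
    console.log('Accepted: ' + (info.accepted || []).join(', '));
    console.log('Rejected: ' + (info.rejected || []).join(', '));
    console.log('Last server response: ' + info.response);
});
```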
### E-mail message fields
The following are the possible fields of an e-mail message:
- **from** - The e-mail address of the sender. All e-mail addresses can be plain `'sender@server.com'` or formatted `'Sender Name <sender@server.com>'`, see [here](#address-formatting) for details
- **sender** - An e-mail address that will appear on the *Sender:* field
- **to** - Comma separated list or an array of recipients e-mail addresses that will appear on the *To:* field
- **cc** - Comma separated list or an array of recipients e-mail addresses that will appear on the *Cc:* field
- **bcc** - Comma separated list or an array of recipients e-mail addresses that will appear on the *Bcc:* field
- **replyTo** - An e-mail address that will appear on the *Reply-To:* field
- **inReplyTo** - The message-id this message is replying to
- **references** - Message-id list (an array or space separated string)
- **subject** - The subject of the e-mail
- **text** - The plaintext version of the message as a Unicode string, Buffer, Stream or an object *{path: '...'}*
- **html** - The HTML version of the message as a Unicode string, Buffer, Stream or an object *{path: '...'}*
- **headers** - An object or array of additional header fields (e.g. *{"X-Key-Name": "key value"}* or *[{key: "X-Key-Name", value: "val1"}, {key: "X-Key-Name", value: "val2"}]*)
- **attachments** - An array of attachment objects (see [below](#attachments) for details)
- **alternatives** - An array of alternative text contents (in addition to text and html parts) (see [below](#alternatives) for details)
- **envelope** - optional SMTP envelope, if auto generated envelope is not suitable (see [below](#smtp-envelope) for details)
- **messageId** - optional Message-Id value, random value will be generated if not set
- **date** - optional Date value, current UTC string will be used if not set
- **encoding** - optional transfer encoding for the textual parts (defaults to 'quoted-printable')
All text fields (e-mail addresses, plaintext body, html body) use UTF-8 as the encoding.
Attachments are streamed as binary.
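As an illustrative sketch, here is a message object combining several of the fields listed above (all addresses and message-ids are placeholders):
```javascript
var mailOptions = {
    from: '"Example App" <app@example.com>',
    to: 'user@example.com',
    replyTo: 'support@example.com',
    inReplyTo: '<previous-message-id@example.com>',
    references: '<previous-message-id@example.com>',
    subject: 'Re: your question',
    text: 'Plaintext version of the message',
    html: '<p>HTML version of the message</p>',
    headers: {
        'X-Custom-Header': 'custom value'
    }
};
```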
### Attachments
Attachment object consists of the following properties:
* **filename** - filename to be reported as the name of the attached file, use of unicode is allowed
* **cid** - optional content id for using inline images in HTML message source
* **content** - String, Buffer or a Stream contents for the attachment
* **encoding** - If set and `content` is a string, then encodes the content to a Buffer using the specified encoding. Example values: `base64`, `hex`, `binary` etc. Useful if you want to use binary attachments in a JSON formatted e-mail object.
* **path** - path to a file or an URL (data uris are allowed as well) if you want to stream the file instead of including it (better for larger attachments)
* **contentType** - optional content type for the attachment, if not set will be derived from the `filename` property
* **contentDisposition** - optional content disposition type for the attachment, defaults to 'attachment'
You can add as many attachments as you want.
```javascript
var mailOptions = {
...
attachments: [
{ // utf-8 string as an attachment
filename: 'text1.txt',
content: 'hello world!'
},
{ // binary buffer as an attachment
filename: 'text2.txt',
content: new Buffer('hello world!','utf-8')
},
{ // file on disk as an attachment
filename: 'text3.txt',
path: '/path/to/file.txt' // stream this file
},
{ // filename and content type is derived from path
path: '/path/to/file.txt'
},
{ // stream as an attachment
filename: 'text4.txt',
content: fs.createReadStream('file.txt')
},
{ // define custom content type for the attachment
filename: 'text.bin',
content: 'hello world!',
contentType: 'text/plain'
},
{ // use URL as an attachment
filename: 'license.txt',
path: 'https://raw.github.com/andris9/Nodemailer/master/LICENSE'
},
{ // encoded string as an attachment
filename: 'text1.txt',
content: 'aGVsbG8gd29ybGQh',
encoding: 'base64'
},
{ // data uri as an attachment
path: 'data:text/plain;base64,aGVsbG8gd29ybGQ='
}
]
}
```
### Alternatives
In addition to text and HTML, any kind of data can be inserted as an alternative content of the main body - for example a word processing document with the same text as in the HTML field. It is the job of the e-mail client to select and show the best fitting alternative to the reader. Usually this field is used for calendar events and such.
Alternative objects use the same options as [attachment objects](#attachments). The difference between an attachment and an alternative is the fact that attachments are placed into *multipart/mixed* or *multipart/related* parts of the message while alternatives are placed into the *multipart/alternative* part.
**Usage example:**
```javascript
var mailOptions = {
...
html: '<b>Hello world!</b>',
alternatives: [
{
contentType: 'text/x-web-markdown',
content: '**Hello world!**'
}
]
}
```
You can add as many alternatives as you want.
### Address Formatting
All the e-mail addresses can be plain e-mail addresses
```
foobar@blurdybloop.com
```
or with formatted name (includes unicode support)
```
"Ноде Майлер" <foobar@blurdybloop.com>
```
> Notice that all address fields (even `from`) are comma separated lists, so if you want to use a comma in the name part, make sure you enclose the name in double quotes: `"Майлер, Ноде" <foobar@blurdybloop.com>`
or as an address object (in this case you do not need to worry about the formatting, no need to use quotes etc.)
```
{
name: 'Майлер, Ноде',
address: 'foobar@blurdybloop.com'
}
```
All address fields accept a comma separated list of e-mails, an array of e-mails, or an array of comma separated lists of e-mails or address objects - use it as you like.
Formatting can be mixed.
```
...,
to: 'foobar@blurdybloop.com, "Ноде Майлер" <bar@blurdybloop.com>, "Name, User" <baz@blurdybloop.com>',
cc: ['foobar@blurdybloop.com', '"Ноде Майлер" <bar@blurdybloop.com>, "Name, User" <baz@blurdybloop.com>'],
bcc: ['foobar@blurdybloop.com', {name: 'Майлер, Ноде', address: 'foobar@blurdybloop.com'}]
...
```
You can even use unicode domains, these are automatically converted to punycode
```
'"Unicode Domain" <info@müriaad-polüteism.info>'
```
### SMTP envelope
SMTP envelope is usually auto generated from `from`, `to`, `cc` and `bcc` fields but
if for some reason you want to specify it yourself, you can do it with `envelope` property.
`envelope` is an object with the following params: `from`, `to`, `cc` and `bcc` just like
with regular mail options. You can also use the regular address format, unicode domains etc.
```javascript
mailOptions = {
...,
from: 'mailer@kreata.ee',
to: 'daemon@kreata.ee',
envelope: {
from: 'Daemon <deamon@kreata.ee>',
to: 'mailer@kreata.ee, Mailer <mailer2@kreata.ee>'
}
}
```
> Not all transports can use the `envelope` object, for example SES ignores it and uses the data from the From:, To: etc. headers.
### Using Embedded Images
Attachments can be used as embedded images in the HTML body. To use this feature, you need to set additional property of the attachment - `cid` (unique identifier of the file) which is a reference to the attachment file. The same `cid` value must be used as the image URL in HTML (using `cid:` as the URL protocol, see example below).
**NB!** the cid value should be as unique as possible!
```javascript
var mailOptions = {
...
html: 'Embedded image: <img src="cid:unique@kreata.ee"/>',
attachments: [{
filename: 'image.png',
path: '/path/to/file',
cid: 'unique@kreata.ee' //same cid value as in the html img src
}]
}
```
## Plugin system
There are 3 stages a plugin can hook into
1. **'compile'** is the step where e-mail data is set but nothing has been done with it yet. At this step you can modify mail options, for example modify `html` content, add new headers etc. Example: [nodemailer-markdown](https://github.com/andris9/nodemailer-markdown) that allows you to use `markdown` source instead of `text` and `html`.
2. **'stream'** is the step where the message tree has been compiled and is ready to be streamed. At this step you can modify the generated MIME tree or add a transform stream that the generated raw e-mail will be piped through before being passed to the transport object. Example: [nodemailer-dkim](https://github.com/andris9/nodemailer-dkim) that adds a DKIM signature to the generated message.
3. **Transport** step where the raw e-mail is streamed to destination. Example: [nodemailer-smtp-transport](https://github.com/andris9/nodemailer-smtp-transport) that streams the message to a SMTP server.
### Including plugins
'compile' and 'stream' plugins can be attached with `use(plugin)` method
```javascript
transporter.use(step, pluginFunc)
```
Where
* **transporter** is a transport object created with `createTransport`
* **step** is a string, either 'compile' or 'stream', that defines when the plugin should be hooked
* **pluginFunc** is a function that takes two arguments: the mail object and a callback function
## Plugin API
All plugins (including transports) get two arguments, the mail object and a callback function.
Mail object that is passed to the plugin function as the first argument is an object with the following properties:
* **data** is the mail data object that is passed to the `sendMail` method
* **message** is the [BuildMail](https://github.com/andris9/buildmail) object of the message. This is available for the 'stream' step and for the transport but not for 'compile'.
* **resolveContent** is a helper function for converting Nodemailer compatible stream objects into Strings or Buffers
### resolveContent()
If your plugin needs to get the full value of a param, for example the String value for the `html` content, you can use `resolveContent()` to convert Nodemailer
compatible content objects to Strings or Buffers.
```javascript
data.resolveContent(obj, key, callback)
```
Where
* **obj** is an object that has a property you want to convert to a String or a Buffer
* **key** is the name of the property you want to convert
* **callback** is the callback function with (err, value) where `value` is either a String or Buffer, depending on the input
**Example**
```javascript
function plugin(mail, callback){
// if mail.data.html is a file or an url, it is returned as a Buffer
mail.resolveContent(mail.data, 'html', function(err, html){
if(err){
return callback(err);
}
console.log('HTML contents: %s', html.toString());
callback();
});
};
```
### 'compile'
Compile step plugins get only the `mail.data` object but not `mail.message` in the `mail` argument of the plugin function. If you need to access the `mail.message` as well use 'stream' step instead.
This is really straightforward, your plugin can modify the `mail.data` object at will and once everything is finished run the callback function. If the callback gets an error object as an argument, then the process is terminated and the error is returned to the `sendMail` callback.
**Example**
The following plugin checks if `text` value is set and if not converts `html` value to `text` by removing all html tags.
```javascript
transporter.use('compile', function(mail, callback){
if(!mail.text && mail.html){
mail.text = mail.html.replace(/<[^>]*>/g, ' ');
}
callback();
});
```
See [plugin-compile.js](examples/plugin-compile.js) for a working example.
### 'stream'
The streaming step is invoked once the message structure is built and ready to be streamed to the transport. The plugin function still gets `mail.data`, but it is included just for reference; modifying it should not change anything (unless the transport requires something from `mail.data`, for example `mail.data.envelope`).
You can modify the `mail.message` object as you like, the message is not yet streaming anything (message starts streaming when the transport calls `mail.message.createReadStream()`).
In most cases you might be interested in the [message.transform()](https://github.com/andris9/buildmail#transform) method for applying transform streams to the raw message.
**Example**
The following plugin replaces all tabs with spaces in the raw message.
```javascript
var transformer = new (require('stream').Transform)();
transformer._transform = function(chunk, encoding, done) {
// replace all tabs with spaces in the stream chunk
for(var i = 0; i < chunk.length; i++){
if(chunk[i] === 0x09){
chunk[i] = 0x20;
}
}
this.push(chunk);
done();
};
transporter.use('stream', function(mail, callback){
// apply output transformer to the raw message stream
mail.message.transform(transformer);
callback();
});
```
See [plugin-stream.js](examples/plugin-stream.js) for a working example.
Additionally you might be interested in the [message.getAddresses()](https://github.com/andris9/buildmail#getaddresses) method that returns the contents for all address fields as structured objects.
**Example**
The following plugin prints address information to console.
```javascript
transporter.use('stream', function(mail, callback){
var addresses = mail.message.getAddresses();
console.log('From: %s', JSON.stringify(addresses.from));
console.log('To: %s', JSON.stringify(addresses.to));
console.log('Cc: %s', JSON.stringify(addresses.cc));
console.log('Bcc: %s', JSON.stringify(addresses.bcc));
callback();
});
```
### Transports
Transports are objects that have a method `send` and properties `name` and `version`. Additionally, if the transport object is an Event Emitter, 'log' events are piped through Nodemailer. A transport object is passed to the `nodemailer.createTransport(transport)` method to create the transporter object.
**`transport.name`**
This is the name of the transport object. For example 'SMTP' or 'SES' etc.
```javascript
transport.name = require('./package.json').name;
```
**`transport.version`**
This should be the transport module version. For example '0.1.0'.
```javascript
transport.version = require('./package.json').version;
```
**`transport.send(mail, callback)`**
This is the method that actually sends out e-mails. The method is basically the same as 'stream' plugin functions. It gets two arguments: `mail` and a callback. To start streaming the message, create the stream with `mail.message.createReadStream()`
The callback function should return an `info` object as the second argument. This info object should contain a `messageId` value with the Message-Id header (without the surrounding &lt; &gt; brackets)
The following example pipes the raw stream to the console.
```javascript
transport.send = function(mail, callback){
var input = mail.message.createReadStream();
var messageId = (mail.message.getHeader('message-id') || '').replace(/[<>\s]/g, '');
input.pipe(process.stdout);
input.on('end', function() {
callback(null, {
messageId: messageId
});
});
};
```
**`transport.close(args*)`**
If your transport needs to be closed explicitly, you can implement a `close` method.
This is a purely optional feature and only makes sense in special contexts (e.g. closing an SMTP pool).
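A minimal sketch of what such a method could look like for a hypothetical transport that keeps connections open (`connectionPool` is a made-up internal object, not part of the Nodemailer API):
```javascript
transport.close = function(callback){
    // release whatever resources the transport is holding,
    // e.g. open connections in a hypothetical internal pool
    connectionPool.end(callback);
};
```
Calling `close()` on the transporter is passed on to this method of the underlying transport object.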
Once you have a transport object, you can create a mail transporter out of it.
```
var nodemailer = require('nodemailer');
var transport = require('some-transport-method');
var transporter = nodemailer.createTransport(transport);
transporter.sendMail({mail data});
```
See [minimal-transport.js](examples/minimal-transport.js) for a working example.
## Using Gmail
Even though Gmail is the fastest way to get started with sending emails, it is by no means a preferable solution unless you are using OAuth2 authentication. Gmail expects the user to be an actual user, not a robot, so it runs a lot of heuristics for every login attempt and blocks anything that looks suspicious to defend the user from account hijacking attempts. For example, you might run into trouble if your server is in another geographical location: everything works on your dev machine, but messages are blocked in production.
Additionally, Gmail has come up with the concept of ['less secure'](https://support.google.com/accounts/answer/6010255?hl=en) apps, which is basically anyone who uses a plain password to log in to Gmail, so you might end up in a situation where one username can send (support for 'less secure' apps is enabled) but another is blocked (support for 'less secure' apps is disabled).
To prevent having login issues you should either use XOAUTH2 (see details [here](https://github.com/andris9/nodemailer-smtp-transport#authentication)) or use another provider, preferably a dedicated one like [Mailgun](http://www.mailgun.com/) or [SendGrid](http://mbsy.co/sendgrid/12237825) or any other. Usually these providers have free plans available that are comparable to the daily sending limits of Gmail. Gmail has a limit of 500 recipients a day (a message with one *To* and one *Cc* address counts as two messages since it has two recipients) for @gmail.com addresses and 2000 for Google Apps customers; larger SMTP providers usually offer about 200-300 recipients a day for free.
## Delivering Bulk Mail
Here are some tips on how to handle bulk mail, for example if you need to send 10 million messages at once (originally published as a [blog post](http://www.andrisreinman.com/delivering-bulk-mail-with-nodemailer/)).
1. **Use a dedicated SMTP provider** like [SendGrid](http://mbsy.co/sendgrid/12237825) or [Mailgun](http://www.mailgun.com/) or any other. Do not use services that offer SMTP as a sideline or for free (that's Gmail or the SMTP of your homepage hosting company) to send bulk mail; you'll hit all the hard limits immediately or get labelled as spam. Basically you get what you pay for, and if you pay zero then your deliverability is near zero as well. E-mail might seem free but it is only free up to a certain amount, and that amount certainly does not include 10 million e-mails in a short period of time.
2. **Use a dedicated queue manager,** for example [RabbitMQ](http://www.rabbitmq.com/) for queueing the e-mails. Nodemailer creates a callback function with related scopes etc. for every message so it might be hard on memory if you pile up the data for 10 million messages at once. Better to take the data from a queue when there's a free spot in the connection pool (previously sent message returns its callback).
3. **Use [nodemailer-smtp-pool](https://github.com/andris9/nodemailer-smtp-pool) transport.** You do not want to have the overhead of creating a new connection and doing the SMTP handshake dance for every single e-mail. Pooled connections make it possible to bring this overhead to a minimum.
4. **Set `maxMessages` option to `Infinity`** for the nodemailer-smtp-pool transport. Dedicated SMTP providers happily accept all your e-mails as long as you are paying for them, so there is no need to disconnect in the middle if everything is going smoothly. The default value is 100, which means that once a connection is used to send 100 messages it is removed from the pool and a new connection is created.
5. **Set `maxConnections` to whatever your system can handle.** There might be limits to this on the receiving side, so do not set it to `Infinity`; even 20 is probably much better than the default 5. A larger number means a larger amount of messages are sent in parallel (see the configuration sketch after this list).
6. **Use file paths not URLs for attachments.** If you are reading the same file from the disk several million times, the contents for the file probably get cached somewhere between your app and the physical hard disk, so you get your files back quicker (assuming you send the same attachment to all recipients). There is nothing like this for URLs: every new message makes a fresh HTTP fetch to receive the file from the server.
7. If the SMTP service offers an HTTP API as well, you might still prefer SMTP over the HTTP API, as HTTP introduces additional overhead. You probably want to use HTTP over SMTP only if the HTTP API is bulk aware: you send a message template and the list of 10 million recipients, and the service compiles this information into e-mails itself; you can't beat this with SMTP.
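Putting points 3-5 together, here is a minimal configuration sketch for a pooled transporter. The host and credentials are placeholders; check the option names against the nodemailer-smtp-pool documentation.
```javascript
var nodemailer = require('nodemailer');
var smtpPool = require('nodemailer-smtp-pool');

var transporter = nodemailer.createTransport(smtpPool({
    host: 'smtp.example.com', // your dedicated SMTP provider
    port: 465,
    secure: true,
    auth: {
        user: 'apiuser',
        pass: 'apipass'
    },
    maxConnections: 20,     // how many messages are sent in parallel (point 5)
    maxMessages: Infinity   // do not recycle connections after 100 messages (point 4)
}));
```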
## License
**Nodemailer** is licensed under [MIT license](https://github.com/andris9/Nodemailer/blob/master/LICENSE). Basically you can do whatever you want to with it
----
The Nodemailer logo was designed by [Sven Kristjansen](https://www.behance.net/kristjansen).

View File

@@ -0,0 +1,20 @@
{
"indent": 4,
"node": true,
"globalstrict": true,
"evil": true,
"unused": true,
"undef": true,
"newcap": true,
"esnext": true,
"curly": true,
"eqeqeq": true,
"expr": true,
"predef": [
"describe",
"it",
"beforeEach",
"afterEach"
]
}

View File

@@ -0,0 +1,4 @@
node_modules/
npm-debug.log
.DS_Store
examples

View File

@@ -0,0 +1,16 @@
language: node_js
node_js:
- "0.10"
- 0.12
- iojs
before_install:
- npm install -g grunt-cli
notifications:
email:
- andris@kreata.ee
webhooks:
urls:
- https://webhooks.gitter.im/e/0ed18fd9b3e529b3c2cc
on_success: change # options: [always|never|change] default: always
on_failure: always # options: [always|never|change] default: always
on_start: false # default: false

View File

@@ -0,0 +1,30 @@
# Changelog
## v1.2.4 2015-04-15
* Only use format=flowed with text/plain and not with other text/* stuff
## v1.2.3 2015-04-15
* Maintenance release, bumped dependency versions
## v1.2.2 2015-04-03
* Maintenance release, bumped libqp which resolves an endless loop in case of a trailing &lt;CR&gt;
## v1.2.1 2014-09-12
* Maintenance release, fixed a test and bumped dependency versions
## v1.2.0 2014-09-12
* Allow functions as transform plugins (the function should create a stream object)
## v1.1.1 2014-08-21
* Bumped libmime version to handle filenames with spaces properly. Short ascii only names with spaces were left unquoted.
## v1.1.0 2014-07-24
* Added new method `getAddresses` that returns all used addresses as a structured object
* Changed version number scheme. Major is now 1 but it is not backwards incompatible with 0.x, as only the scheme changed but not the content

View File

@@ -0,0 +1,29 @@
module.exports = function(grunt) {
'use strict';
// Project configuration.
grunt.initConfig({
jshint: {
all: ['src/*.js', 'test/*.js'],
options: {
jshintrc: '.jshintrc'
}
},
mochaTest: {
all: {
options: {
reporter: 'spec'
},
src: ['test/*-unit.js']
}
}
});
// Load the plugin(s)
grunt.loadNpmTasks('grunt-contrib-jshint');
grunt.loadNpmTasks('grunt-mocha-test');
// Tasks
grunt.registerTask('default', ['jshint', 'mochaTest']);
};

19
node_modules/nodemailer/node_modules/buildmail/LICENSE generated vendored Normal file
View File

@@ -0,0 +1,19 @@
Copyright (c) 2014-2015 Andris Reinman
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.

View File

@@ -0,0 +1,554 @@
# buildmail
Low level rfc2822 message composer that streams output. Define your own mime tree, no magic included.
Ported from [MailBuild](https://github.com/whiteout-io/mailbuild) of the [emailjs.org](http://emailjs.org/) project. This port uses a similar API but is for Node only and streams the output.
## Usage
Install with npm
npm install buildmail
Require in your scripts
```javascript
var BuildMail = require('buildmail');
```
## API
Create a new `BuildMail` object with
```javascript
var builder = new BuildMail(contentType [, options]);
```
Where
* **contentType** - define the content type for created node. Can be left blank for attachments (content type derived from `filename` option if available)
* **options** - an optional options object
* **filename** - *String* filename for an attachment node
* **baseBoundary** - *String* shared part of the unique multipart boundary (generated randomly if not set)
* **keepBcc** - *Boolean* If true keep the Bcc value in generated headers (default is to remove it)
## Methods
The same methods apply to the root node created with `new BuildMail()` and to any child nodes.
### createChild
Creates and appends a child node to the node object
```javascript
node.createChild(contentType, options)
```
The same arguments apply as with `new BuildMail()`. Created node object is returned.
**Example**
```javascript
new BuildMail('multipart/mixed').
createChild('multipart/related').
createChild('text/plain');
```
Generates the following mime tree:
```
multipart/mixed
↳ multipart/related
↳ text/plain
```
### appendChild
Appends an existing child node to the node object. Removes the node from an existing tree if needed.
```javascript
node.appendChild(childNode)
```
Where
* **childNode** - child node to be appended
Method returns appended child node.
**Example**
```javascript
var childNode = new BuildMail('text/plain'),
rootNode = new BuildMail('multipart/mixed');
rootNode.appendChild(childNode);
```
Generates the following mime tree:
```
multipart/mixed
↳ text/plain
```
## replace
Replaces current node with another node
```javascript
node.replace(replacementNode)
```
Where
* **replacementNode** - node to replace the current node with
Method returns replacement node.
**Example**
```javascript
var rootNode = new BuildMail('multipart/mixed'),
childNode = rootNode.createChild('text/plain');
childNode.replace(new BuildMail('text/html'));
```
Generates the following mime tree:
```
multipart/mixed
↳ text/html
```
## remove
Removes current node from the mime tree. Does not make a lot of sense for a root node.
```javascript
node.remove();
```
Method returns removed node.
**Example**
```javascript
var rootNode = new BuildMail('multipart/mixed'),
childNode = rootNode.createChild('text/plain');
childNode.remove();
```
Generates the following mime tree:
```
multipart/mixed
```
## setHeader
Sets a header value. If the value for selected key exists, it is overwritten.
You can set multiple values as well by using `[{key:'', value:''}]` or
`{key: 'value'}` structures as the first argument.
```javascript
node.setHeader(key, value);
```
Where
* **key** - *String|Array|Object* Header key or a list of key value pairs
* **value** - *String* Header value
Method returns current node.
**Example**
```javascript
new BuildMail('text/plain').
setHeader('content-disposition', 'inline').
setHeader({
'content-transfer-encoding': '7bit'
}).
setHeader([
{key: 'message-id', value: 'abcde'}
]);
```
Generates the following header:
```
Content-type: text/plain
Content-Disposition: inline
Content-Transfer-Encoding: 7bit
Message-Id: <abcde>
```
## addHeader
Adds a header value. If the value for selected key exists, the value is appended
as a new field and old one is not touched.
You can set multiple values as well by using `[{key:'', value:''}]` or
`{key: 'value'}` structures as the first argument.
```javascript
node.addHeader(key, value);
```
Where
* **key** - *String|Array|Object* Header key or a list of key value pairs
* **value** - *String* Header value
Method returns current node.
**Example**
```javascript
new BuildMail('text/plain').
addHeader('X-Spam', '1').
addHeader({
'x-spam': '2'
}).
addHeader([
{key: 'x-spam', value: '3'}
]);
```
Generates the following header:
```
Content-type: text/plain
X-Spam: 1
X-Spam: 2
X-Spam: 3
```
## getHeader
Retrieves the first matching value of a selected key
```javascript
node.getHeader(key)
```
Where
* **key** - *String* Key to search for
**Example**
```javascript
new BuildMail('text/plain').getHeader('content-type'); // text/plain
```
## buildHeaders
Builds the current header info into a header block that can be used in an e-mail
```javascript
var headers = node.buildHeaders()
```
**Example**
```javascript
new BuildMail('text/plain').
addHeader('X-Spam', '1').
setHeader({
'x-spam': '2'
}).
setHeader([
{key: 'x-spam', value: '3'}
]).buildHeaders();
```
returns the following String
```
Content-Type: text/plain
X-Spam: 3
Date: Sat, 21 Jun 2014 10:52:44 +0000
Message-Id: <1403347964894-790a5296-0eb7c7c7-6440334f@localhost>
MIME-Version: 1.0
```
If the node is the root node, then `Date` and `Message-Id` values are generated automatically if missing
## setContent
Sets body content for current node. If the value is a string and Content-Type is text/* then charset is set automatically.
If the value is a Buffer or a Stream you need to specify the charset yourself.
```javascript
node.setContent(body)
```
Where
* **body** - *String|Buffer|Stream|Object* body content
If the value is an object, it should include one of the following properties
* **path** - path to a file that will be used as the content
* **href** - URL that will be used as the content
**Example**
```javascript
new BuildMail('text/plain').setContent('Hello world!');
new BuildMail('text/plain; charset=utf-8').setContent(fs.createReadStream('message.txt'));
```
## build
Builds the rfc2822 message from the current node. If this is a root node, mandatory header fields are set if missing (Date, Message-Id, MIME-Version)
```javascript
node.build(callback)
```
Callback returns the rfc2822 message as a Buffer
**Example**
```javascript
new BuildMail('text/plain').setContent('Hello world!').build(function(err, mail){
console.log(mail.toString('ascii'));
});
```
Returns the following string:
```
Content-type: text/plain
Date: <current datetime>
Message-Id: <generated value>
MIME-Version: 1.0
Hello world!
```
## createReadStream
If you manage large attachments you probably do not want to generate the whole message in memory but rather stream it.
```javascript
var stream = node.createReadStream(options)
```
Where
* **options** - *Object* optional Stream options (ie. `highWaterMark`)
**Example**
```javascript
var message = new BuildMail();
message.addHeader({
from: 'From <from@example.com>',
to: 'receiver1@example.com',
cc: 'receiver2@example.com'
});
message.setContent(fs.createReadStream('message.txt'));
message.createReadStream().pipe(fs.createWriteStream('message.eml'));
```
## transform
If you want to modify the created stream, you can add transform streams that the output will be piped through.
```javascript
node.transform(transformStream)
```
Where
* **transformStream** - *Stream* or *Function* Transform stream that the output will go through before being returned by `createReadStream`. If the value is a function, it should return a transform stream object when called.
**Example**
```javascript
var PassThrough = require('stream').PassThrough;
var message = new BuildMail();
message.addHeader({
from: 'From <from@example.com>',
to: 'receiver1@example.com',
cc: 'receiver2@example.com'
});
message.setContent(fs.createReadStream('message.txt'));
message.transform(new PassThrough()); // add a stream that the output will be piped through
message.createReadStream().pipe(fs.createWriteStream('message.eml'));
```
## setEnvelope
Sets the envelope object to use. If one is not set, it is generated based on the headers.
```javascript
node.setEnvelope(envelope)
```
Where
* **envelope** is an envelope object in the form of `{from:'address', to: ['addresses']}`
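For illustration, a minimal sketch that overrides the auto generated envelope (addresses are placeholders):
```javascript
var node = new BuildMail('text/plain').
    addHeader({
        from: 'From <from@example.com>',
        to: 'receiver1@example.com'
    }).
    setContent('Hello world!');

// force a different return path and an extra envelope recipient
node.setEnvelope({
    from: 'bounces@example.com',
    to: ['receiver1@example.com', 'archive@example.com']
});
```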
## getEnvelope
Generates a SMTP envelope object. Makes sense only for root node.
```javascript
var envelope = node.getEnvelope()
```
Method returns the envelope in the form of `{from:'address', to: ['addresses']}`
**Example**
```javascript
new BuildMail().
addHeader({
from: 'From <from@example.com>',
to: 'receiver1@example.com',
cc: 'receiver2@example.com'
}).
getEnvelope();
```
Returns the following object:
```javascript
{
'from': 'from@example.com',
'to': ['receiver1@example.com', 'receiver2@example.com']
}
```
## getAddresses
Returns an address container object. Includes all parsed addresses from From, Sender, To, Cc, Bcc and Reply-To fields.
While `getEnvelope()` returns the 'from' value as a single address (the first one encountered), `getAddresses` returns all values as arrays, including `from`. Additionally, while `getEnvelope` returns only `from` and a combined `to` value, `getAddresses` returns all fields separately.
Possible return values (all arrays in the form of `[{name:'', address:''}]`):
* **from**
* **sender**
* **'reply-to'**
* **to**
* **cc**
* **bcc**
If no addresses were found for a particular field, the field is not set in the response object.
**Example**
```javascript
new BuildMail().
addHeader({
from: 'From <from@example.com>',
to: '"Receiver" receiver1@example.com',
cc: 'receiver2@example.com'
}).
getAddresses();
```
Returns the following object:
```javascript
{
from: [{
name: 'From',
address: 'from@example.com'
}],
to: [{
name: 'Receiver',
address: 'receiver1@example.com'
}],
cc: [{
name: '',
address: 'receiver2@example.com'
}]
}
```
## Notes
### Addresses
When setting address headers (`From`, `To`, `Cc`, `Bcc`) use of unicode is allowed. If needed
the addresses are converted to punycode automatically.
### Attachments
For attachments you should minimally set `filename` option and `Content-Disposition` header. If filename is specified, you can leave content type blank - if content type is not set, it is detected from the filename.
```javascript
new BuildMail('multipart/mixed').
createChild(false, {filename: 'image.png'}).
setHeader('Content-Disposition', 'attachment');
```
Obviously you might want to add `Content-Id` header as well if you want to reference this attachment from the HTML content.
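A minimal sketch of such a tree (filename and cid are placeholders; the `type` parameter mentioned under MIME structure below is omitted for brevity):
```javascript
var fs = require('fs');
var root = new BuildMail('multipart/related');

// the HTML part references the image by its Content-Id
root.createChild('text/html').
    setContent('<p>Embedded image: <img src="cid:logo@example.com"/></p>');

// the image node carries the matching Content-Id header
root.createChild(false, {filename: 'logo.png'}).
    setHeader('Content-Disposition', 'inline').
    setHeader('Content-Id', '<logo@example.com>').
    setContent(fs.createReadStream('logo.png'));
```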
### MIME structure
Most probably you only need to deal with the following multipart types when generating messages:
* **multipart/alternative** - includes the same content in different forms (usually text/plain + text/html)
* **multipart/related** - includes main node and related nodes (eg. text/html + referenced attachments). Also requires a `type` parameter that indicates the Content-Type of the *root* element in the node
* **multipart/mixed** - includes other multipart nodes and attachments, or single content node and attachments
**Examples**
One content node and an attachment
```
multipart/mixed
↳ text/plain
↳ image/png
```
Content node with referenced attachment (eg. image with `Content-Type` referenced by `cid:` url in the HTML)
```
multipart/related
↳ text/html
↳ image/png
```
Plaintext and HTML alternatives
```
multipart/alternative
↳ text/html
↳ text/plain
```
One content node with referenced attachment and a regular attachment
```
multipart/mixed
↳ multipart/related
↳ text/plain
↳ image/png
↳ application/x-zip
```
Alternative content with referenced attachment for HTML and a regular attachment
```
multipart/mixed
↳ multipart/alternative
↳ text/plain
↳ multipart/related
↳ text/html
↳ image/png
↳ application/x-zip
```
## License
**MIT**

View File

@@ -0,0 +1,17 @@
{
"indent": 4,
"node": true,
"globalstrict": true,
"evil": true,
"unused": true,
"undef": true,
"newcap": true,
"esnext": true,
"curly": true,
"eqeqeq": true,
"predef": [
"describe",
"it"
]
}

View File

@@ -0,0 +1,3 @@
.travis.yml
test
Gruntfile.js

View File

@@ -0,0 +1,6 @@
# Changelog
## v0.3.2 2015-01-07
* Added changelog
* Allow semicolon (;) as address separator in addition to comma (,). Backport from https://github.com/whiteout-io/addressparser/pull/5

View File

@@ -0,0 +1,16 @@
Copyright (c) 2014 Andris Reinman
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

View File

@@ -0,0 +1,64 @@
# addressparser
Parse e-mail address fields. Input can be a single address (`"andris@kreata.ee"`), a formatted address (`"Andris Reinman <andris@kreata.ee>"`), comma separated list of addresses (`"andris@kreata.ee, andris.reinman@kreata.ee"`), an address group (`"disclosed-recipients:andris@kreata.ee;"`) or a mix of all the formats.
In addition to comma the semicolon is treated as the list delimiter as well (except when used in the group syntax), so a value `"andris@kreata.ee; andris.reinman@kreata.ee"` is identical to `"andris@kreata.ee, andris.reinman@kreata.ee"`.
## Installation
Install with npm
```
npm install addressparser
```
## Usage
Include the module
```javascript
var addressparser = require('addressparser');
```
Parse some address strings with `addressparser(field)`
```javascript
var addresses = addressparser('andris <andris@tr.ee>');
console.log(addresses); // [{name: "andris", address:"andris@tr.ee"}]
```
And when using groups
```javascript
addressparser('Composers:"Bach, Sebastian" <sebu@example.com>, mozart@example.com (Mozzie);');
```
the result would be
```
[
{
name: "Composers",
group: [
{
address: "sebu@example.com",
name: "Bach, Sebastian"
},
{
address: "mozart@example.com",
name: "Mozzie"
}
]
}
]
```
> Be prepared though that groups might be nested.
## Notes
This module does not decode any mime-word or punycode encoded strings. It is only a basic parser for the raw address data; you need to decode the encoded parts yourself afterwards.
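As a rough sketch of such post-processing (this assumes the separate `libmime` module for mime-word decoding and Node's bundled `punycode` module for domains, neither of which is part of this package):
```javascript
var addressparser = require('addressparser');
var libmime = require('libmime');
var punycode = require('punycode');

var decoded = addressparser('=?UTF-8?Q?J=C3=B5geva?= <andris@xn--jgeva-dua.ee>').map(function(addr) {
    return {
        // decode mime encoded-words in the display name
        name: libmime.decodeWords(addr.name),
        // convert a punycoded domain back to unicode, leave the local part as is
        address: addr.address.replace(/@(.+)$/, function(match, domain) {
            return '@' + punycode.toUnicode(domain);
        })
    };
});
console.log(decoded); // [{name: 'Jõgeva', address: 'andris@jõgeva.ee'}]
```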
## License
**MIT**

View File

@@ -0,0 +1,34 @@
{
"name": "addressparser",
"version": "0.3.2",
"description": "Parse e-mail addresses",
"main": "src/addressparser.js",
"repository": {
"type": "git",
"url": "git+https://github.com/andris9/addressparser.git"
},
"author": {
"name": "Andris Reinman"
},
"license": "MIT",
"scripts": {
"test": "grunt"
},
"devDependencies": {
"chai": "^1.10.0",
"grunt": "^0.4.5",
"grunt-contrib-jshint": "^0.10.0",
"grunt-mocha-test": "^0.12.4",
"mocha": "^2.1.0"
},
"readme": "# addressparser\n\nParse e-mail address fields. Input can be a single address (`\"andris@kreata.ee\"`), a formatted address (`\"Andris Reinman <andris@kreata.ee>\"`), comma separated list of addresses (`\"andris@kreata.ee, andris.reinman@kreata.ee\"`), an address group (`\"disclosed-recipients:andris@kreata.ee;\"`) or a mix of all the formats.\n\nIn addition to comma the semicolon is treated as the list delimiter as well (except when used in the group syntax), so a value `\"andris@kreata.ee; andris.reinman@kreata.ee\"` is identical to `\"andris@kreata.ee, andris.reinman@kreata.ee\"`.\n\n## Installation\n\nInstall with npm\n\n```\nnpm install addressparser\n```\n\n## Usage\n\nInclude the module\n\n```javascript\nvar addressparser = require('addressparser');\n```\n\nParse some address strings with `addressparser(field)`\n\n```javascript\nvar addresses = addressparser('andris <andris@tr.ee>');\nconsole.log(addresses); // [{name: \"andris\", address:\"andris@tr.ee\"}]\n```\n\nAnd when using groups\n\n```javascript\naddressparser('Composers:\"Bach, Sebastian\" <sebu@example.com>, mozart@example.com (Mozzie);');\n```\n\nthe result would be\n\n```\n[\n {\n name: \"Composers\",\n group: [\n {\n address: \"sebu@example.com\",\n name: \"Bach, Sebastian\"\n },\n {\n address: \"mozart@example.com\",\n name: \"Mozzie\"\n }\n ]\n }\n]\n```\n\n> Be prepared though that groups might be nested.\n\n## Notes\n\nThis module does not decode any mime-word or punycode encoded strings, it is only a basic parser for parsing the base data, you need to decode the encoded parts later by yourself\n\n## License\n\n**MIT**",
"readmeFilename": "README.md",
"bugs": {
"url": "https://github.com/andris9/addressparser/issues"
},
"homepage": "https://github.com/andris9/addressparser#readme",
"_id": "addressparser@0.3.2",
"_shasum": "59873f35e8fcf6c7361c10239261d76e15348bb2",
"_resolved": "https://registry.npmjs.org/addressparser/-/addressparser-0.3.2.tgz",
"_from": "addressparser@>=0.3.2 <0.4.0"
}

View File

@@ -0,0 +1,286 @@
'use strict';
// expose to the world
module.exports = addressparser;
/**
* Parses structured e-mail addresses from an address field
*
* Example:
*
* 'Name <address@domain>'
*
* will be converted to
*
* [{name: 'Name', address: 'address@domain'}]
*
* @param {String} str Address field
* @return {Array} An array of address objects
*/
function addressparser(str) {
var tokenizer = new Tokenizer(str),
tokens = tokenizer.tokenize();
var addresses = [],
address = [],
parsedAddresses = [];
tokens.forEach(function(token) {
if (token.type === 'operator' && (token.value === ',' || token.value === ';')) {
if (address.length) {
addresses.push(address);
}
address = [];
} else {
address.push(token);
}
});
if (address.length) {
addresses.push(address);
}
addresses.forEach(function(address) {
address = _handleAddress(address);
if (address.length) {
parsedAddresses = parsedAddresses.concat(address);
}
});
return parsedAddresses;
}
/**
* Converts tokens for a single address into an address object
*
* @param {Array} tokens Tokens object
* @return {Object} Address object
*/
function _handleAddress(tokens) {
var token,
isGroup = false,
state = 'text',
address,
addresses = [],
data = {
address: [],
comment: [],
group: [],
text: []
},
i, len;
// Filter out <addresses>, (comments) and regular text
for (i = 0, len = tokens.length; i < len; i++) {
token = tokens[i];
if (token.type === 'operator') {
switch (token.value) {
case '<':
state = 'address';
break;
case '(':
state = 'comment';
break;
case ':':
state = 'group';
isGroup = true;
break;
default:
state = 'text';
}
} else {
if (token.value) {
data[state].push(token.value);
}
}
}
// If there is no text but there is a comment, use the comment as the text
if (!data.text.length && data.comment.length) {
data.text = data.comment;
data.comment = [];
}
if (isGroup) {
// http://tools.ietf.org/html/rfc2822#appendix-A.1.3
data.text = data.text.join(' ');
addresses.push({
name: data.text || (address && address.name),
group: data.group.length ? addressparser(data.group.join(',')) : []
});
} else {
// If no address was found, try to detect one from regular text
if (!data.address.length && data.text.length) {
for (i = data.text.length - 1; i >= 0; i--) {
if (data.text[i].match(/^[^@\s]+@[^@\s]+$/)) {
data.address = data.text.splice(i, 1);
break;
}
}
var _regexHandler = function(address) {
if (!data.address.length) {
data.address = [address.trim()];
return ' ';
} else {
return address;
}
};
// still no address
if (!data.address.length) {
for (i = data.text.length - 1; i >= 0; i--) {
data.text[i] = data.text[i].replace(/\s*\b[^@\s]+@[^@\s]+\b\s*/, _regexHandler).trim();
if (data.address.length) {
break;
}
}
}
}
// If there is still no text but a comment exists, use the comment as the text
if (!data.text.length && data.comment.length) {
data.text = data.comment;
data.comment = [];
}
// Keep only the first address occurrence, push others to regular text
if (data.address.length > 1) {
data.text = data.text.concat(data.address.splice(1));
}
// Join values with spaces
data.text = data.text.join(' ');
data.address = data.address.join(' ');
if (!data.address && isGroup) {
return [];
} else {
address = {
address: data.address || data.text || '',
name: data.text || data.address || ''
};
if (address.address === address.name) {
if ((address.address || '').match(/@/)) {
address.name = '';
} else {
address.address = '';
}
}
addresses.push(address);
}
}
return addresses;
}
/**
* Creates a Tokenizer object for tokenizing address field strings
*
* @constructor
* @param {String} str Address field string
*/
function Tokenizer(str) {
this.str = (str || '').toString();
this.operatorCurrent = '';
this.operatorExpecting = '';
this.node = null;
this.escaped = false;
this.list = [];
}
/**
* Operator tokens and which tokens are expected to end the sequence
*/
Tokenizer.prototype.operators = {
'"': '"',
'(': ')',
'<': '>',
',': '',
':': ';',
// Semicolons are not a legal delimiter per the RFC2822 grammar other
// than for terminating a group, but they are also not valid for any
// other use in this context. Given that some mail clients have
// historically allowed the semicolon as a delimiter equivalent to the
// comma in their UI, it makes sense to treat them the same as a comma
// when used outside of a group.
';': ''
};
/**
* Tokenizes the original input string
*
* @return {Array} An array of operator|text tokens
*/
Tokenizer.prototype.tokenize = function() {
var chr, list = [];
for (var i = 0, len = this.str.length; i < len; i++) {
chr = this.str.charAt(i);
this.checkChar(chr);
}
this.list.forEach(function(node) {
node.value = (node.value || '').toString().trim();
if (node.value) {
list.push(node);
}
});
return list;
};
/**
* Checks if a character is an operator or text and acts accordingly
*
* @param {String} chr Character from the address field
*/
Tokenizer.prototype.checkChar = function(chr) {
if ((chr in this.operators || chr === '\\') && this.escaped) {
this.escaped = false;
} else if (this.operatorExpecting && chr === this.operatorExpecting) {
this.node = {
type: 'operator',
value: chr
};
this.list.push(this.node);
this.node = null;
this.operatorExpecting = '';
this.escaped = false;
return;
} else if (!this.operatorExpecting && chr in this.operators) {
this.node = {
type: 'operator',
value: chr
};
this.list.push(this.node);
this.node = null;
this.operatorExpecting = this.operators[chr];
this.escaped = false;
return;
}
if (!this.escaped && chr === '\\') {
this.escaped = true;
return;
}
if (!this.node) {
this.node = {
type: 'text',
value: ''
};
this.list.push(this.node);
}
if (this.escaped && chr !== '\\') {
this.node.value += '\\';
}
this.node.value += chr;
this.escaped = false;
};

View File

@@ -0,0 +1,4 @@
.travis.yml
.jshintrc
Gruntfile.js
test

View File

@@ -0,0 +1,19 @@
Copyright (c) 2014 Andris Reinman
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.

View File

@@ -0,0 +1,108 @@
# libbase64
Encode and decode base64 strings.
## Usage
Install with npm
npm install libbase64
Require in your script
```javascript
var libbase64 = require('libbase64');
```
### Encode values
Encode Buffer objects or unicode strings with
libbase64.encode(val) → String
Where
* **val** is a Buffer or a unicode string
**Example**
```javascript
libbase64.encode('jõgeva');
// asO1Z2V2YQ==
```
### Wrap encoded values
To enforce soft line breaks on lines longer than the selected number of characters, use `wrap`
libbase64.wrap(str[, lineLength]) → String
Where
* **str** is a base64 encoded string
* **lineLength** (defaults to 76) is the maximum allowed line length
**Example**
```javascript
libbase64.wrap('asO1Z2V2asO1Z2V2asO1Z2V2YQ==', 10)
// asO1Z2V2as\r\n
// O1Z2V2asO1\r\n
// Z2V2YQ==
```
### Transform Streams
`libbase64` makes it possible to encode and decode streams with `libbase64.Encoder` and `libbase64.Decoder` constructors.
### Encoder Stream
Create new Encoder Stream with
var encoder = new libbase64.Encoder([options])
Where
* **options** is the optional stream options object with an additional option `lineLength` if you want to use any other line length than the default 76 characters (or set to `false` to turn the soft wrapping off completely)
**Example**
The following example script reads in a file, encodes it to base64 and saves the output to a file.
```javascript
var libbase64 = require('libbase64');
var fs = require('fs');
var source = fs.createReadStream('source.txt');
var encoded = fs.createWriteStream('encoded.txt');
var encoder = new libbase64.Encoder();
source.pipe(encoder).pipe(encoded);
```
### Decoder Stream
Create new Decoder Stream with
var decoder = new libbase64.Decoder([options])
Where
* **options** is the optional stream options object
**Example**
The following example script reads in a file in base64 encoding, decodes it and saves the output to a file.
```javascript
var libbase64 = require('libbase64');
var fs = require('fs');
var encoded = fs.createReadStream('encoded.txt');
var dest = fs.createWriteStream('dest.txt');
var decoder = new libbase64.Decoder();
encoded.pipe(decoder).pipe(dest);
```
## License
**MIT**

View File

@@ -0,0 +1,201 @@
'use strict';
var stream = require('stream');
var util = require('util');
var Transform = stream.Transform;
// expose to the world
module.exports = {
encode: encode,
decode: decode,
wrap: wrap,
Encoder: Encoder,
Decoder: Decoder
};
/**
* Encodes a Buffer into a base64 encoded string
*
* @param {Buffer} buffer Buffer to convert
* @returns {String} base64 encoded string
*/
function encode(buffer) {
if (typeof buffer === 'string') {
buffer = new Buffer(buffer, 'utf-8');
}
return buffer.toString('base64');
}
/**
* Decodes a base64 encoded string to a Buffer object
*
* @param {String} str base64 encoded string
* @returns {Buffer} Decoded value
*/
function decode(str) {
str = (str || '');
return new Buffer(str, 'base64');
}
/**
* Adds soft line breaks to a base64 string
*
* @param {String} str base64 encoded string that might need line wrapping
* @param {Number} [lineLength=76] Maximum allowed length for a line
* @returns {String} Soft-wrapped base64 encoded string
*/
function wrap(str, lineLength) {
str = (str || '').toString();
lineLength = lineLength || 76;
if (str.length <= lineLength) {
return str;
}
return str.replace(new RegExp('.{' + lineLength + '}', 'g'), '$&\r\n').trim();
}
/**
* Creates a transform stream for encoding data to base64 encoding
*
* @constructor
* @param {Object} options Stream options
* @param {Number} [options.lineLength=76] Maximum length for lines, set to false to disable wrapping
*/
function Encoder(options) {
// init Transform
this.options = options || {};
if (this.options.lineLength !== false) {
this.options.lineLength = this.options.lineLength || 76;
}
this._curLine = '';
this._remainingBytes = false;
this.inputBytes = 0;
this.outputBytes = 0;
Transform.call(this, this.options);
}
util.inherits(Encoder, Transform);
Encoder.prototype._transform = function(chunk, encoding, done) {
var b64, _self = this;
if (encoding !== 'buffer') {
chunk = new Buffer(chunk, encoding);
}
if (!chunk || !chunk.length) {
return done();
}
this.inputBytes += chunk.length;
if (this._remainingBytes && this._remainingBytes.length) {
chunk = Buffer.concat([this._remainingBytes, chunk]);
this._remainingBytes = false;
}
if (chunk.length % 3) {
this._remainingBytes = chunk.slice(chunk.length - chunk.length % 3);
chunk = chunk.slice(0, chunk.length - chunk.length % 3);
} else {
this._remainingBytes = false;
}
b64 = this._curLine + encode(chunk);
if (this.options.lineLength) {
b64 = wrap(b64, this.options.lineLength);
b64 = b64.replace(/(^|\n)([^\n]*)$/, function(match, lineBreak, lastLine) {
_self._curLine = lastLine;
return lineBreak;
});
}
if (b64) {
this.outputBytes += b64.length;
this.push(b64);
}
done();
};
Encoder.prototype._flush = function(done) {
if (this._remainingBytes && this._remainingBytes.length) {
this._curLine += encode(this._remainingBytes);
}
if (this._curLine) {
this._curLine = wrap(this._curLine, this.options.lineLength);
this.outputBytes += this._curLine.length;
this.push(this._curLine, 'ascii');
this._curLine = '';
}
done();
};
/**
* Creates a transform stream for decoding base64 encoded strings
*
* @constructor
* @param {Object} options Stream options
*/
function Decoder(options) {
// init Transform
this.options = options || {};
this._curLine = '';
this.inputBytes = 0;
this.outputBytes = 0;
Transform.call(this, this.options);
}
util.inherits(Decoder, Transform);
Decoder.prototype._transform = function(chunk, encoding, done) {
var b64, buf;
chunk = chunk.toString('ascii');
if (!chunk || !chunk.length) {
return done();
}
this.inputBytes += chunk.length;
b64 = (this._curLine + chunk);
this._curLine = '';
b64 = b64.replace(/[^a-zA-Z0-9+\/=]/g, '');
if (b64.length % 4) {
this._curLine = b64.substr(-b64.length % 4);
if (this._curLine.length === b64.length) {
b64 = '';
} else {
// decode all complete 4-character groups; keep only the trailing remainder for the next chunk
b64 = b64.substr(0, b64.length - this._curLine.length);
}
}
if (b64) {
buf = decode(b64);
this.outputBytes += buf.length;
this.push(buf);
}
done();
};
Decoder.prototype._flush = function(done) {
var b64, buf;
if (this._curLine) {
buf = decode(this._curLine);
this.outputBytes += buf.length;
this.push(buf);
this._curLine = '';
}
done();
};

View File

@@ -0,0 +1,37 @@
{
"name": "libbase64",
"version": "0.1.0",
"description": "Encode and decode base64 encoded strings",
"main": "lib/libbase64.js",
"scripts": {
"test": "grunt"
},
"repository": {
"type": "git",
"url": "git://github.com/andris9/libbase64.git"
},
"keywords": [
"base64",
"mime"
],
"author": {
"name": "Andris Reinman"
},
"license": "MIT",
"bugs": {
"url": "https://github.com/andris9/libbase64/issues"
},
"homepage": "https://github.com/andris9/libbase64",
"devDependencies": {
"chai": "~1.8.1",
"grunt": "~0.4.1",
"grunt-contrib-jshint": "~0.8.0",
"grunt-mocha-test": "~0.10.0"
},
"readme": "# libbase64\n\nEncode and decode base64 strings.\n\n## Usage\n\nInstall with npm\n\n npm install libbase64\n\nRequire in your script\n\n```javascript\nvar libbase64 = require('libbase64');\n```\n\n### Encode values\n\nEncode Buffer objects or unicode strings with\n\n libbase64.encode(val) → String\n\nWhere\n\n * **val** is a Buffer or an unicode string\n\n**Example**\n\n```javascript\nlibbase64.encode('jõgeva');\n// asO1Z2V2YQ==\n```\n\n### Wrap encoded values\n\nTo enforce soft line breaks on lines longer than selected amount of characters, use `wrap`\n\n libbase64.wrap(str[, lineLength]) → String\n\nWhere\n\n * **str** is a base64 encoded string\n * **lineLength** (defaults to 76) is the maximum allowed line length\n\n**Example**\n\n```javascript\nlibbase64.wrap('asO1Z2V2asO1Z2V2asO1Z2V2YQ==', 10)\n// asO1Z2V2as\\r\\n\n// O1Z2V2asO1\\r\\n\n// Z2V2YQ==\n```\n\n### Transform Streams\n\n`libbase64` makes it possible to encode and decode streams with `libbase64.Encoder` and `libbase64.Decoder` constructors.\n\n### Encoder Stream\n\nCreate new Encoder Stream with\n\n var encoder = new libbase64.Encoder([options])\n\nWhere\n\n * **options** is the optional stream options object with an additional option `lineLength` if you want to use any other line length than the default 76 characters (or set to `false` to turn the soft wrapping off completely)\n\n**Example**\n\nThe following example script reads in a file, encodes it to base64 and saves the output to a file.\n\n```javascript\nvar libbase64 = require('libbase64');\nvar fs = require('fs');\nvar source = fs.createReadStream('source.txt');\nvar encoded = fs.createReadStream('encoded.txt');\nvar encoder = new libbase64.Encoder();\n\nsource.pipe(encoder).pipe(encoded);\n```\n\n### Decoder Stream\n\nCreate new Decoder Stream with\n\n var decoder = new libbase64.Decoder([options])\n\nWhere\n\n * **options** is the optional stream options object\n\n**Example**\n\nThe following example script reads in a file in base64 encoding, decodes it and saves the output to a file.\n\n```javascript\nvar libbase64 = require('libbase64');\nvar fs = require('fs');\nvar encoded = fs.createReadStream('encoded.txt');\nvar dest = fs.createReadStream('dest.txt');\nvar decoder = new libbase64.Decoder();\n\nencoded.pipe(decoder).pipe(dest);\n```\n\n## License\n\n**MIT**",
"readmeFilename": "README.md",
"_id": "libbase64@0.1.0",
"_shasum": "62351a839563ac5ff5bd26f12f60e9830bb751e6",
"_resolved": "https://registry.npmjs.org/libbase64/-/libbase64-0.1.0.tgz",
"_from": "libbase64@>=0.1.0 <0.2.0"
}

View File

@@ -0,0 +1,4 @@
.travis.yml
.jshintrc
Gruntfile.js
test

View File

@@ -0,0 +1,19 @@
Copyright (c) 2014 Andris Reinman
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.

View File

@@ -0,0 +1,109 @@
# libqp
Encode and decode quoted-printable strings according to [RFC2045](http://tools.ietf.org/html/rfc2045#section-6.7).
## Usage
Install with npm
npm install libqp
Require in your script
```javascript
var libqp = require('libqp');
```
### Encode values
Encode Buffer objects or unicode strings with
libqp.encode(val) → String
Where
* **val** is a Buffer or a unicode string
**Example**
```javascript
libqp.encode('jõgeva');
// j=C3=B5geva
```
### Wrap encoded values
Quoted-Printable encoded lines are limited to 76 characters, but the `encode` method might return lines longer than that limit.
To enforce soft line breaks on lines longer than 76 (or any other length) characters, use `wrap`
libqp.wrap(str[, lineLength]) → String
Where
* **str** is a Quoted-Printable encoded string
* **lineLength** (defaults to 76) is the maximum allowed line length. Any longer line will be soft wrapped
**Example**
```javascript
libqp.wrap('abc j=C3=B5geva', 10)
// abc j=\r\n
// =C3=B5geva
```
### Transform Streams
`libqp` makes it possible to encode and decode streams with `libqp.Encoder` and `libqp.Decoder` constructors.
### Encoder Stream
Create new Encoder Stream with
var encoder = new libqp.Encoder([options])
Where
* **options** is the optional stream options object with an additional option `lineLength` if you want to use any other line length than the default 76 characters (or set to `false` to turn the soft wrapping off completely)
**Example**
The following example script reads in a file, encodes it to Quoted-Printable and saves the output to a file.
```javascript
var libqp = require('libqp');
var fs = require('fs');
var source = fs.createReadStream('source.txt');
var encoded = fs.createWriteStream('encoded.txt');
var encoder = new libqp.Encoder();
source.pipe(encoder).pipe(encoded);
```
### Decoder Stream
Create new Decoder Stream with
var decoder = new libqp.Decoder([options])
Where
* **options** is the optional stream options object
**Example**
The following example script reads in a file in Quoted-Printable encoding, decodes it and saves the output to a file.
```javascript
var libqp = require('libqp');
var fs = require('fs');
var encoded = fs.createReadStream('encoded.txt');
var dest = fs.createWriteStream('dest.txt');
var decoder = new libqp.Decoder();
encoded.pipe(decoder).pipe(dest);
```
## License
**MIT**

View File

@@ -0,0 +1,313 @@
'use strict';
var stream = require('stream');
var util = require('util');
var Transform = stream.Transform;
// expose to the world
module.exports = {
encode: encode,
decode: decode,
wrap: wrap,
Encoder: Encoder,
Decoder: Decoder
};
/**
* Encodes a Buffer into a Quoted-Printable encoded string
*
* @param {Buffer} buffer Buffer to convert
* @returns {String} Quoted-Printable encoded string
*/
function encode(buffer) {
if (typeof buffer === 'string') {
buffer = new Buffer(buffer, 'utf-8');
}
// usable characters that do not need encoding
var ranges = [
[0x09],
[0x0A],
[0x0D],
[0x20, 0x3C],
[0x3E, 0x3F],
[0x40, 0x7E]
];
var result = '';
var ord;
for (var i = 0, len = buffer.length; i < len; i++) {
ord = buffer[i];
// if the char is in the allowed range, keep it as is, unless it is whitespace at the end of a line
if (checkRanges(ord, ranges) && !((ord === 0x20 || ord === 0x09) && (i === len - 1 || buffer[i + 1] === 0x0a || buffer[i + 1] === 0x0d))) {
result += String.fromCharCode(ord);
continue;
}
result += '=' + (ord < 0x10 ? '0' : '') + ord.toString(16).toUpperCase();
}
return result;
}
/**
* Decodes a Quoted-Printable encoded string to a Buffer object
*
* @param {String} str Quoted-Printable encoded string
* @returns {Buffer} Decoded value
*/
function decode(str) {
str = (str || '').toString().
// remove invalid whitespace from the end of lines
replace(/[\t ]+$/gm, '').
// remove soft line breaks
replace(/\=(?:\r?\n|$)/g, '');
var encodedBytesCount = (str.match(/\=[\da-fA-F]{2}/g) || []).length,
bufferLength = str.length - encodedBytesCount * 2,
chr, hex,
buffer = new Buffer(bufferLength),
bufferPos = 0;
for (var i = 0, len = str.length; i < len; i++) {
chr = str.charAt(i);
if (chr === '=' && (hex = str.substr(i + 1, 2)) && /[\da-fA-F]{2}/.test(hex)) {
buffer[bufferPos++] = parseInt(hex, 16);
i += 2;
continue;
}
buffer[bufferPos++] = chr.charCodeAt(0);
}
return buffer;
}
/**
* Adds soft line breaks to a Quoted-Printable string
*
* @param {String} str Quoted-Printable encoded string that might need line wrapping
* @param {Number} [lineLength=76] Maximum allowed length for a line
* @returns {String} Soft-wrapped Quoted-Printable encoded string
*/
function wrap(str, lineLength) {
str = (str || '').toString();
lineLength = lineLength || 76;
if (str.length <= lineLength) {
return str;
}
var pos = 0,
len = str.length,
match, code, line,
lineMargin = Math.floor(lineLength / 3),
result = '';
// insert soft linebreaks where needed
while (pos < len) {
line = str.substr(pos, lineLength);
if ((match = line.match(/\r\n/))) {
line = line.substr(0, match.index + match[0].length);
result += line;
pos += line.length;
continue;
}
if (line.substr(-1) === '\n') {
// nothing to change here
result += line;
pos += line.length;
continue;
} else if ((match = line.substr(-lineMargin).match(/\n.*?$/))) {
// truncate to nearest line break
line = line.substr(0, line.length - (match[0].length - 1));
result += line;
pos += line.length;
continue;
} else if (line.length > lineLength - lineMargin && (match = line.substr(-lineMargin).match(/[ \t\.,!\?][^ \t\.,!\?]*$/))) {
// truncate to nearest space
line = line.substr(0, line.length - (match[0].length - 1));
} else {
if (line.match(/\=[\da-f]{0,2}$/i)) {
// push incomplete encoding sequences to the next line
if ((match = line.match(/\=[\da-f]{0,1}$/i))) {
line = line.substr(0, line.length - match[0].length);
}
// ensure that utf-8 sequences are not split
while (line.length > 3 && line.length < len - pos && !line.match(/^(?:=[\da-f]{2}){1,4}$/i) && (match = line.match(/\=[\da-f]{2}$/ig))) {
code = parseInt(match[0].substr(1, 2), 16);
if (code < 128) {
break;
}
line = line.substr(0, line.length - 3);
if (code >= 0xC0) {
break;
}
}
}
}
if (pos + line.length < len && line.substr(-1) !== '\n') {
if (line.length === lineLength && line.match(/\=[\da-f]{2}$/i)) {
line = line.substr(0, line.length - 3);
} else if (line.length === lineLength) {
line = line.substr(0, line.length - 1);
}
pos += line.length;
line += '=\r\n';
} else {
pos += line.length;
}
result += line;
}
return result;
}
/**
* Helper function to check if a number is inside provided ranges
*
* @param {Number} nr Number to check for
* @param {Array} ranges An Array of allowed values
* @returns {Boolean} True if the value was found inside allowed ranges, false otherwise
*/
function checkRanges(nr, ranges) {
for (var i = ranges.length - 1; i >= 0; i--) {
if (!ranges[i].length) {
continue;
}
if (ranges[i].length === 1 && nr === ranges[i][0]) {
return true;
}
if (ranges[i].length === 2 && nr >= ranges[i][0] && nr <= ranges[i][1]) {
return true;
}
}
return false;
}
/**
* Creates a transform stream for encoding data to Quoted-Printable encoding
*
* @constructor
* @param {Object} options Stream options
* @param {Number} [options.lineLength=76] Maximum length for lines, set to false to disable wrapping
*/
function Encoder(options) {
// init Transform
this.options = options || {};
if (this.options.lineLength !== false) {
this.options.lineLength = this.options.lineLength || 76;
}
this._curLine = '';
this.inputBytes = 0;
this.outputBytes = 0;
Transform.call(this, this.options);
}
util.inherits(Encoder, Transform);
Encoder.prototype._transform = function(chunk, encoding, done) {
var qp, _self = this;
if (encoding !== 'buffer') {
chunk = new Buffer(chunk, encoding);
}
if (!chunk || !chunk.length) {
return done();
}
this.inputBytes += chunk.length;
if (this.options.lineLength) {
qp = this._curLine + encode(chunk);
qp = wrap(qp, this.options.lineLength);
qp = qp.replace(/(^|\n)([^\n]*)$/, function(match, lineBreak, lastLine) {
_self._curLine = lastLine;
return lineBreak;
});
if (qp) {
this.outputBytes += qp.length;
this.push(qp);
}
} else {
qp = encode(chunk);
this.outputBytes += qp.length;
this.push(qp, 'ascii');
}
done();
};
Encoder.prototype._flush = function(done) {
if (this._curLine) {
this.outputBytes += this._curLine.length;
this.push(this._curLine, 'ascii');
}
done();
};
/**
* Creates a transform stream for decoding Quoted-Printable encoded strings
*
* @constructor
* @param {Object} options Stream options
*/
function Decoder(options) {
// init Transform
this.options = options || {};
this._curLine = '';
this.inputBytes = 0;
this.outputBytes = 0;
Transform.call(this, this.options);
}
util.inherits(Decoder, Transform);
Decoder.prototype._transform = function(chunk, encoding, done) {
var qp, buf, _self = this;
chunk = chunk.toString('ascii');
if (!chunk || !chunk.length) {
return done();
}
this.inputBytes += chunk.length;
qp = (this._curLine + chunk);
this._curLine = '';
qp = qp.replace(/=[^\n]?$/, function(lastLine) {
_self._curLine = lastLine;
return '';
});
if (qp) {
buf = decode(qp);
this.outputBytes += buf.length;
this.push(buf);
}
done();
};
Decoder.prototype._flush = function(done) {
var qp, buf;
if (this._curLine) {
buf = decode(this._curLine);
this.outputBytes += buf.length;
this.push(buf);
}
done();
};

View File

@@ -0,0 +1,37 @@
{
"name": "libqp",
"version": "1.0.0",
"description": "Encode and decode quoted-printable strings according to rfc2045",
"main": "lib/libqp.js",
"scripts": {
"test": "grunt"
},
"repository": {
"type": "git",
"url": "git://github.com/andris9/libqp.git"
},
"keywords": [
"quoted-printable",
"mime"
],
"author": {
"name": "Andris Reinman"
},
"license": "MIT",
"bugs": {
"url": "https://github.com/andris9/libqp/issues"
},
"homepage": "https://github.com/andris9/libqp",
"devDependencies": {
"chai": "~2.2.0",
"grunt": "~0.4.5",
"grunt-contrib-jshint": "~0.11.1",
"grunt-mocha-test": "~0.12.7"
},
"readme": "# libqp\n\nEncode and decode quoted-printable strings according to [RFC2045](http://tools.ietf.org/html/rfc2045#section-6.7).\n\n## Usage\n\nInstall with npm\n\n npm install libqp\n\nRequire in your script\n\n```javascript\nvar libqp = require('libqp');\n```\n\n### Encode values\n\nEncode Buffer objects or unicode strings with\n\n libqp.encode(val) → String\n\nWhere\n\n * **val** is a Buffer or an unicode string\n\n**Example**\n\n```javascript\nlibqp.encode('jõgeva');\n// j=C3=B5geva\n```\n\n### Wrap encoded values\n\nQuoted-Printable encoded lines are limited to 76 characters but `encode` method might return lines longer than the limit.\n\nTo enforce soft line breaks on lines longer than 76 (or any other length) characters, use `wrap`\n\n libqp.wrap(str[, lineLength]) → String\n\nWhere\n\n * **str** is a Quoted-Printable encoded string\n * **lineLength** (defaults to 76) is the maximum allowed line length. Any longer line will be soft wrapped\n\n**Example**\n\n```javascript\nlibqp.wrap('abc j=C3=B5geva', 10)\n// abc j=\\r\\n\n// =C3=B5geva\n```\n\n### Transform Streams\n\n`libqp` makes it possible to encode and decode streams with `libqp.Encoder` and `libqp.Decoder` constructors.\n\n### Encoder Stream\n\nCreate new Encoder Stream with\n\n var encoder = new libqp.Encoder([options])\n\nWhere\n\n * **options** is the optional stream options object with an additional option `lineLength` if you want to use any other line length than the default 76 characters (or set to `false` to turn the soft wrapping off completely)\n\n**Example**\n\nThe following example script reads in a file, encodes it to Quoted-Printable and saves the output to a file.\n\n```javascript\nvar libqp = require('libqp');\nvar fs = require('fs');\nvar source = fs.createReadStream('source.txt');\nvar encoded = fs.createReadStream('encoded.txt');\nvar encoder = new libqp.Encoder();\n\nsource.pipe(encoder).pipe(encoded);\n```\n\n### Decoder Stream\n\nCreate new Decoder Stream with\n\n var decoder = new libqp.Decoder([options])\n\nWhere\n\n * **options** is the optional stream options object\n\n**Example**\n\nThe following example script reads in a file in Quoted-Printable encoding, decodes it and saves the output to a file.\n\n```javascript\nvar libqp = require('libqp');\nvar fs = require('fs');\nvar encoded = fs.createReadStream('encoded.txt');\nvar dest = fs.createReadStream('dest.txt');\nvar decoder = new libqp.Decoder();\n\nencoded.pipe(decoder).pipe(dest);\n```\n\n## License\n\n**MIT**",
"readmeFilename": "README.md",
"_id": "libqp@1.0.0",
"_shasum": "aded044d83970c152de5b983d39c3b2d291f9a74",
"_resolved": "https://registry.npmjs.org/libqp/-/libqp-1.0.0.tgz",
"_from": "libqp@>=1.0.0 <2.0.0"
}

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,902 @@
'use strict';
var libmime = require('libmime');
var libqp = require('libqp');
var libbase64 = require('libbase64');
var punycode = require('punycode');
var addressparser = require('addressparser');
var stream = require('stream');
var PassThrough = stream.PassThrough;
var fs = require('fs');
var hyperquest = require('hyperquest');
module.exports = MimeNode;
/**
* Creates a new mime tree node. Assumes 'multipart/*' as the content type
* if it is a branch, anything else counts as leaf. If rootNode is missing from
* the options, assumes this is the root.
*
* @param {String} contentType Define the content type for the node. Can be left blank for attachments (derived from filename)
* @param {Object} [options] optional options
* @param {Object} [options.rootNode] root node for this tree
* @param {Object} [options.parentNode] immediate parent for this node
* @param {Object} [options.filename] filename for an attachment node
* @param {String} [options.baseBoundary] shared part of the unique multipart boundary
* @param {Boolean} [options.keepBcc] If true, do not exclude Bcc from the generated headers
*/
function MimeNode(contentType, options) {
this.nodeCounter = 0;
options = options || {};
/**
* shared part of the unique multipart boundary
*/
this.baseBoundary = options.baseBoundary || Date.now().toString() + Math.random();
/**
* If the Date header is missing and the current node is the root, this value is used instead
*/
this.date = new Date();
/**
* Root node for current mime tree
*/
this.rootNode = options.rootNode || this;
/**
* If true include Bcc in generated headers (if available)
*/
this.keepBcc = !!options.keepBcc;
/**
* If filename is specified but contentType is not (probably an attachment)
* detect the content type from filename extension
*/
if (options.filename) {
/**
* Filename for this node. Useful with attachments
*/
this.filename = options.filename;
if (!contentType) {
contentType = libmime.detectMimeType(this.filename.split('.').pop());
}
}
/**
* Immediate parent for this node (or undefined if not set)
*/
this.parentNode = options.parentNode;
/**
* An array for possible child nodes
*/
this.childNodes = [];
/**
* Used for generating unique boundaries (prepended to the shared base)
*/
this._nodeId = ++this.rootNode.nodeCounter;
/**
* A list of header values for this node in the form of [{key:'', value:''}]
*/
this._headers = [];
/**
* True if the content only uses ASCII printable characters
* @type {Boolean}
*/
this._isPlainText = false;
/**
* True if the content is plain text but has longer lines than allowed
* @type {Boolean}
*/
this._canUseFlowedContent = false;
this._isFlowedContent = false;
/**
* If set, this value is used as the envelope instead of generating one
* @type {Boolean}
*/
this._envelope = false;
/**
* Additional transform streams that the message will be piped through before
* being exposed by createReadStream
* @type {Array}
*/
this._transforms = [];
/**
* If content type is set (or derived from the filename) add it to headers
*/
if (contentType) {
this.setHeader('content-type', contentType);
}
}
/////// PUBLIC METHODS
/**
* Creates and appends a child node. Arguments provided are passed to the MimeNode constructor
*
* @param {String} [contentType] Optional content type
* @param {Object} [options] Optional options object
* @return {Object} Created node object
*/
MimeNode.prototype.createChild = function(contentType, options) {
if (!options && typeof contentType === 'object') {
options = contentType;
contentType = undefined;
}
var node = new MimeNode(contentType, options);
this.appendChild(node);
return node;
};
/**
* Appends an existing node to the mime tree. Removes the node from an existing
* tree if needed
*
* @param {Object} childNode node to be appended
* @return {Object} Appended node object
*/
MimeNode.prototype.appendChild = function(childNode) {
if (childNode.rootNode !== this.rootNode) {
childNode.rootNode = this.rootNode;
childNode._nodeId = ++this.rootNode.nodeCounter;
}
childNode.parentNode = this;
this.childNodes.push(childNode);
return childNode;
};
/**
* Replaces current node with another node
*
* @param {Object} node Replacement node
* @return {Object} Replacement node
*/
MimeNode.prototype.replace = function(node) {
if (node === this) {
return this;
}
this.parentNode.childNodes.forEach(function(childNode, i) {
if (childNode === this) {
node.rootNode = this.rootNode;
node.parentNode = this.parentNode;
node._nodeId = this._nodeId;
this.rootNode = this;
this.parentNode = undefined;
node.parentNode.childNodes[i] = node;
}
}.bind(this));
return node;
};
/**
* Removes current node from the mime tree
*
* @return {Object} removed node
*/
MimeNode.prototype.remove = function() {
if (!this.parentNode) {
return this;
}
for (var i = this.parentNode.childNodes.length - 1; i >= 0; i--) {
if (this.parentNode.childNodes[i] === this) {
this.parentNode.childNodes.splice(i, 1);
this.parentNode = undefined;
this.rootNode = this;
return this;
}
}
};
/**
* Sets a header value. If the value for selected key exists, it is overwritten.
* You can set multiple values as well by using [{key:'', value:''}] or
* {key: 'value'} as the first argument.
*
* @param {String|Array|Object} key Header key or a list of key value pairs
* @param {String} value Header value
* @return {Object} current node
*/
MimeNode.prototype.setHeader = function(key, value) {
var added = false,
headerValue;
// Allow setting multiple headers at once
if (!value && key && typeof key === 'object') {
// allow {key:'content-type', value: 'text/plain'}
if (key.key && key.value) {
this.setHeader(key.key, key.value);
}
// allow [{key:'content-type', value: 'text/plain'}]
else if (Array.isArray(key)) {
key.forEach(function(i) {
this.setHeader(i.key, i.value);
}.bind(this));
}
// allow {'content-type': 'text/plain'}
else {
Object.keys(key).forEach(function(i) {
this.setHeader(i, key[i]);
}.bind(this));
}
return this;
}
key = this._normalizeHeaderKey(key);
headerValue = {
key: key,
value: value
};
// Check if the value exists and overwrite
for (var i = 0, len = this._headers.length; i < len; i++) {
if (this._headers[i].key === key) {
if (!added) {
// replace the first match
this._headers[i] = headerValue;
added = true;
} else {
// remove following matches
this._headers.splice(i, 1);
i--;
len--;
}
}
}
// match not found, append the value
if (!added) {
this._headers.push(headerValue);
}
return this;
};
/**
* Adds a header value. If the value for selected key exists, the value is appended
* as a new field and old one is not touched.
* You can set multiple values as well by using [{key:'', value:''}] or
* {key: 'value'} as the first argument.
*
* @param {String|Array|Object} key Header key or a list of key value pairs
* @param {String} value Header value
* @return {Object} current node
*/
MimeNode.prototype.addHeader = function(key, value) {
// Allow setting multiple headers at once
if (!value && key && typeof key === 'object') {
// allow {key:'content-type', value: 'text/plain'}
if (key.key && key.value) {
this.addHeader(key.key, key.value);
}
// allow [{key:'content-type', value: 'text/plain'}]
else if (Array.isArray(key)) {
key.forEach(function(i) {
this.addHeader(i.key, i.value);
}.bind(this));
}
// allow {'content-type': 'text/plain'}
else {
Object.keys(key).forEach(function(i) {
this.addHeader(i, key[i]);
}.bind(this));
}
return this;
}
this._headers.push({
key: this._normalizeHeaderKey(key),
value: value
});
return this;
};
/**
* Retrieves the first matching value of a selected key
*
* @param {String} key Key to search for
* @return {String} Value for the key
*/
MimeNode.prototype.getHeader = function(key) {
key = this._normalizeHeaderKey(key);
for (var i = 0, len = this._headers.length; i < len; i++) {
if (this._headers[i].key === key) {
return this._headers[i].value;
}
}
};
/**
* Sets body content for current node. If the value is a string, charset is added automatically
* to Content-Type (if it is text/*). If the value is a Buffer, you need to specify
* the charset yourself
*
* @param {String|Buffer} content Body content
* @return {Object} current node
*/
MimeNode.prototype.setContent = function(content) {
var _self = this;
this.content = content;
if (typeof this.content.pipe === 'function') {
this._contentErrorHandler = function(err) {
_self.content.removeListener('error', _self._contentErrorHandler);
_self.content = '<' + err.message + '>';
};
this.content.once('error', this._contentErrorHandler);
} else if (typeof this.content === 'string') {
this._isPlainText = libmime.isPlainText(this.content);
if (this._isPlainText && libmime.hasLongerLines(this.content, 76)) {
// If there are lines longer than 76 symbols/bytes, use 'format=flowed' for text nodes
this._canUseFlowedContent = true;
}
}
return this;
};
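/**
 * Compiles the rfc2822 message from this node and its children into a single
 * Buffer by draining the stream returned from createReadStream
 *
 * @param {Function} callback Callback function with (err, buffer) arguments
 */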
MimeNode.prototype.build = function(callback) {
var stream = this.createReadStream();
var buf = [];
var buflen = 0;
stream.on('data', function(chunk) {
if (chunk && chunk.length) {
buf.push(chunk);
buflen += chunk.length;
}
});
stream.once('end', function(chunk) {
if (chunk && chunk.length) {
buf.push(chunk);
buflen += chunk.length;
}
return callback(null, Buffer.concat(buf, buflen));
});
};
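/**
 * Determines the Content-Transfer-Encoding for this node: an explicitly set
 * 'base64' or 'quoted-printable' value is kept, plain ASCII text defaults to
 * '7bit', other text content to 'quoted-printable' and any other non-multipart
 * content to 'base64'
 *
 * @returns {String|Boolean} Transfer encoding value, or false if the node has no content
 */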
MimeNode.prototype.getTransferEncoding = function() {
var transferEncoding = false;
var contentType = (this.getHeader('Content-Type') || '').toString().toLowerCase().trim();
if (this.content) {
transferEncoding = (this.getHeader('Content-Transfer-Encoding') || '').toString().toLowerCase().trim();
if (!transferEncoding || ['base64', 'quoted-printable'].indexOf(transferEncoding) < 0) {
if (/^text\//i.test(contentType)) {
// If there are no special symbols, no need to modify the text
if (this._isPlainText) {
transferEncoding = '7bit';
} else {
transferEncoding = 'quoted-printable';
}
} else if (!/^multipart\//i.test(contentType)) {
transferEncoding = transferEncoding || 'base64';
}
}
}
return transferEncoding;
};
/**
* Builds the header block for the mime node. Append \r\n\r\n before writing the content
*
* @returns {String} Headers
*/
MimeNode.prototype.buildHeaders = function() {
var _self = this;
var transferEncoding = this.getTransferEncoding();
var headers = [];
if (transferEncoding) {
this.setHeader('Content-Transfer-Encoding', transferEncoding);
}
if (this.filename && !this.getHeader('Content-Disposition')) {
this.setHeader('Content-Disposition', 'attachment');
}
// Ensure mandatory header fields
if (this.rootNode === this) {
if (!this.getHeader('Date')) {
this.setHeader('Date', this.date.toUTCString().replace(/GMT/, '+0000'));
}
// You really should define your own Message-Id field!
if (!this.getHeader('Message-Id')) {
this.setHeader('Message-Id', '<' +
// crux to generate random strings like this:
// "1401391905590-58aa8c32-d32a065c-c1a2aad2"
[0, 0, 0].reduce(function(prev) {
return prev + '-' + Math.floor((1 + Math.random()) * 0x100000000).
toString(16).
substring(1);
}, Date.now()) +
'@' +
// try to use the domain of the FROM address or fallback localhost
(this.getEnvelope().from || 'localhost').split('@').pop() +
'>');
}
if (!this.getHeader('MIME-Version')) {
this.setHeader('MIME-Version', '1.0');
}
}
this._headers.forEach(function(header) {
var key = header.key,
value = header.value,
structured;
switch (header.key) {
case 'Content-Disposition':
structured = libmime.parseHeaderValue(value);
if (_self.filename) {
structured.params.filename = _self.filename;
}
value = libmime.buildHeaderValue(structured);
break;
case 'Content-Type':
structured = libmime.parseHeaderValue(value);
_self._handleContentType(structured);
if (structured.value.match(/^text\/plain\b/) && typeof _self.content === 'string') {
if (_self._canUseFlowedContent) {
structured.params.format = 'flowed';
}
if (/[\u0080-\uFFFF]/.test(_self.content)) {
structured.params.charset = 'utf-8';
}
}
_self._isFlowedContent = String(structured.params.format).toLowerCase().trim() === 'flowed';
value = libmime.buildHeaderValue(structured);
break;
case 'Bcc':
if (!_self.keepBcc) {
// skip BCC values
return;
}
break;
}
// skip empty lines
value = _self._encodeHeaderValue(key, value);
if (!(value || '').toString().trim()) {
return;
}
headers.push(libmime.foldLines(key + ': ' + value, 76));
});
return headers.join('\r\n');
};
/**
* Streams the rfc2822 message from the current node. If this is a root node,
* mandatory header fields are set if missing (Date, Message-Id, MIME-Version)
*
* @return {Object} Readable stream for the compiled message
*/
MimeNode.prototype.createReadStream = function(options) {
options = options || {};
var outputStream = new PassThrough(options);
this.stream(outputStream, options, function() {
outputStream.end();
});
for (var i = 0, len = this._transforms.length; i < len; i++) {
outputStream = outputStream.pipe(typeof this._transforms[i] === 'function' ? this._transforms[i]() : this._transforms[i]);
}
return outputStream;
};
/**
* Appends a transform stream object to the transforms list. Final output
* is passed through this stream before exposing
*
* @param {Object} transform Read-Write stream
*/
MimeNode.prototype.transform = function(transform) {
this._transforms.push(transform);
};
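/**
 * Writes the headers and the (encoded) content of this node, followed
 * recursively by its child nodes, into the provided output stream
 *
 * @param {Object} outputStream Stream the message is written to
 * @param {Object} options Optional stream options passed to the content encoders
 * @param {Function} callback Callback function run once this node and its children are fully written
 */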
MimeNode.prototype.stream = function(outputStream, options, callback) {
var _self = this;
var transferEncoding = this.getTransferEncoding();
var contentStream;
var localStream;
// pushes node content
function sendContent() {
if (_self.content) {
if (typeof _self.content.pipe === 'function') {
_self.content.removeListener('error', _self._contentErrorHandler);
_self._contentErrorHandler = function(err) {
if (contentStream) {
contentStream.write('<' + err.message + '>');
contentStream.end();
} else {
outputStream.write('<' + err.message + '>');
setImmediate(finalize);
}
};
_self.content.once('error', _self._contentErrorHandler);
}
if (['quoted-printable', 'base64'].indexOf(transferEncoding) >= 0) {
contentStream = new(transferEncoding === 'base64' ? libbase64 : libqp).Encoder(options);
contentStream.pipe(outputStream, {
end: false
});
contentStream.once('end', finalize);
localStream = _self._getStream(_self.content);
// using `on` instead of `once` because hyperquest 0.3.0 seems to
// throw when `once('error')` is used and an error occurs
localStream.on('error', function(err) {
contentStream.end('[' + err.message + ']');
});
localStream.pipe(contentStream);
return;
} else {
if (_self._isFlowedContent) {
localStream = _self._getStream(libmime.encodeFlowed(_self.content));
} else {
localStream = _self._getStream(_self.content);
}
localStream.pipe(outputStream, {
end: false
});
localStream.once('end', finalize);
// using `on` instead of `once` because hyperquest 0.3.0 seems to
// throw when `once('error')` is used and an error occurs
localStream.on('error', function(err) {
localStream.write('[' + err.message + ']');
finalize();
});
return;
}
} else {
return setImmediate(finalize);
}
}
// for multipart nodes, push child nodes
// for content nodes end the stream
function finalize() {
var childId = 0;
var processChildNode = function() {
if (childId >= _self.childNodes.length) {
outputStream.write('\r\n--' + _self.boundary + '--\r\n');
return callback();
}
var child = _self.childNodes[childId++];
outputStream.write((childId > 1 ? '\r\n' : '') + '--' + _self.boundary + '\r\n');
child.stream(outputStream, options, function() {
setImmediate(processChildNode);
});
};
if (_self.multipart) {
setImmediate(processChildNode);
} else {
return callback();
}
}
outputStream.write(this.buildHeaders() + '\r\n\r\n');
setImmediate(sendContent);
};
/**
* Sets envelope to be used instead of the generated one
*
* @param {Object} envelope SMTP envelope in the form of {from: 'from@example.com', to: ['to@example.com']}
* @return {Object} current node
*/
MimeNode.prototype.setEnvelope = function(envelope) {
this._envelope = {
from: envelope.from || false,
to: [].concat(envelope.to || [])
};
return this;
};
/**
* Generates and returns an object with parsed address fields
*
* @return {Object} Address object
*/
MimeNode.prototype.getAddresses = function() {
var addresses = {};
this._headers.forEach(function(header) {
var key = header.key.toLowerCase();
if (['from', 'sender', 'reply-to', 'to', 'cc', 'bcc'].indexOf(key) >= 0) {
if (!Array.isArray(addresses[key])) {
addresses[key] = [];
}
this._convertAddresses(this._parseAddresses(header.value), addresses[key]);
}
}.bind(this));
return addresses;
};
/**
* Generates and returns SMTP envelope with the sender address and a list of recipients addresses
*
* @return {Object} SMTP envelope in the form of {from: 'from@example.com', to: ['to@example.com']}
*/
MimeNode.prototype.getEnvelope = function() {
if (this._envelope) {
return this._envelope;
}
var envelope = {
from: false,
to: []
};
this._headers.forEach(function(header) {
var list = [];
if (header.key === 'From' || (!envelope.from && ['Reply-To', 'Sender'].indexOf(header.key) >= 0)) {
this._convertAddresses(this._parseAddresses(header.value), list);
if (list.length && list[0]) {
envelope.from = list[0].address;
}
} else if (['To', 'Cc', 'Bcc'].indexOf(header.key) >= 0) {
this._convertAddresses(this._parseAddresses(header.value), envelope.to);
}
}.bind(this));
envelope.to = envelope.to.map(function(to) {
return to.address;
});
return envelope;
};
/////// PRIVATE METHODS
/**
* Detects and returns a handle to a stream related to the content.
*
* @param {Mixed} content Node content
* @returns {Object} Stream object
*/
MimeNode.prototype._getStream = function(content) {
var contentStream;
if (typeof content.pipe === 'function') {
return content;
} else if (content && typeof content.path === 'string' && !content.href) {
return fs.createReadStream(content.path);
} else if (content && typeof content.href === 'string') {
return hyperquest(content.href);
} else {
contentStream = new PassThrough();
contentStream.end(content || '');
return contentStream;
}
};
/**
* Parses addresses. Takes in a single address or an array or an
* array of address arrays (eg. To: [[first group], [second group],...])
*
* @param {Mixed} addresses Addresses to be parsed
* @return {Array} An array of address objects
*/
MimeNode.prototype._parseAddresses = function(addresses) {
return [].concat.apply([], [].concat(addresses).map(function(address) {
if (address && address.address) {
address = this._convertAddresses(address);
}
return addressparser(address);
}.bind(this)));
};
/**
* Normalizes a header key, uses Camel-Case form, except for uppercase MIME-
*
* @param {String} key Key to be normalized
* @return {String} key in Camel-Case form
*/
MimeNode.prototype._normalizeHeaderKey = function(key) {
return (key || '').toString().
// no newlines in keys
replace(/\r?\n|\r/g, ' ').
trim().toLowerCase().
// use uppercase words, except MIME
replace(/^MIME\b|^[a-z]|\-[a-z]/ig, function(c) {
return c.toUpperCase();
});
};
/**
* Checks if the content type is multipart and defines boundary if needed.
* Doesn't return anything, modifies object argument instead.
*
* @param {Object} structured Parsed header value for 'Content-Type' key
*/
MimeNode.prototype._handleContentType = function(structured) {
this.contentType = structured.value.trim().toLowerCase();
this.multipart = this.contentType.split('/').reduce(function(prev, value) {
return prev === 'multipart' ? value : false;
});
if (this.multipart) {
this.boundary = structured.params.boundary = structured.params.boundary || this.boundary || this._generateBoundary();
} else {
this.boundary = false;
}
};
/**
* Generates a multipart boundary value
*
* @return {String} boundary value
*/
MimeNode.prototype._generateBoundary = function() {
return '----sinikael-?=_' + this._nodeId + '-' + this.rootNode.baseBoundary;
};
/**
* Encodes a header value for use in the generated rfc2822 email.
*
* @param {String} key Header key
* @param {String} value Header value
*/
MimeNode.prototype._encodeHeaderValue = function(key, value) {
key = this._normalizeHeaderKey(key);
switch (key) {
// Structured headers
case 'From':
case 'Sender':
case 'To':
case 'Cc':
case 'Bcc':
case 'Reply-To':
return this._convertAddresses(this._parseAddresses(value));
// values enclosed in <>
case 'Message-Id':
case 'In-Reply-To':
case 'Content-Id':
value = (value || '').toString().replace(/\r?\n|\r/g, ' ');
if (value.charAt(0) !== '<') {
value = '<' + value;
}
if (value.charAt(value.length - 1) !== '>') {
value = value + '>';
}
return value;
// space separated list of values enclosed in <>
case 'References':
value = [].concat.apply([], [].concat(value || '').map(function(elm) {
elm = (elm || '').toString().replace(/\r?\n|\r/g, ' ').trim();
return elm.replace(/<[^>]*>/g, function(str) {
return str.replace(/\s/g, '');
}).split(/\s+/);
})).map(function(elm) {
if (elm.charAt(0) !== '<') {
elm = '<' + elm;
}
if (elm.charAt(elm.length - 1) !== '>') {
elm = elm + '>';
}
return elm;
});
return value.join(' ').trim();
case 'Date':
if (Object.prototype.toString.call(value) === '[object Date]') {
return value.toUTCString().replace(/GMT/, '+0000');
}
value = (value || '').toString().replace(/\r?\n|\r/g, ' ');
return libmime.encodeWords(value, 'Q', 52);
default:
value = (value || '').toString().replace(/\r?\n|\r/g, ' ');
// encodeWords only encodes if needed, otherwise the original string is returned
return libmime.encodeWords(value, 'Q', 52);
}
};
/**
* Rebuilds address object using punycode and other adjustments
*
* @param {Array} addresses An array of address objects
* @param {Array} [uniqueList] An array to be populated with addresses
* @return {String} address string
*/
MimeNode.prototype._convertAddresses = function(addresses, uniqueList) {
var values = [];
uniqueList = uniqueList || [];
[].concat(addresses || []).forEach(function(address) {
if (address.address) {
address.address = address.address.replace(/^.*?(?=\@)/, function(user) {
// pretty bad solution but what you gonna do
// unicode usernames are converted to encoded words
// 'jõgeva@hot.ee' will be converted to '=?utf-8?Q?j=C3=B5geva?=@hot.ee'
return libmime.encodeWords(user, 'Q', 52);
}).replace(/@.+$/, function(domain) {
// domains are punycoded by default
// 'jõgeva.ee' will be converted to 'xn--jgeva-dua.ee'
// non-unicode domains are left as is
return '@' + punycode.toASCII(domain.substr(1));
});
if (!address.name) {
values.push(address.address);
} else if (address.name) {
values.push(this._encodeAddressName(address.name) + ' <' + address.address + '>');
}
if (address.address) {
if (!uniqueList.filter(function(a) {
return a.address === address.address;
}).length) {
uniqueList.push(address);
}
}
} else if (address.group) {
values.push(this._encodeAddressName(address.name) + ':' + (address.group.length ? this._convertAddresses(address.group, uniqueList) : '').trim() + ';');
}
}.bind(this));
return values.join(', ');
};
/**
* If needed, mime encodes the name part
*
* @param {String} name Name part of an address
* @returns {String} Mime word encoded string if needed
*/
MimeNode.prototype._encodeAddressName = function(name) {
if (!/^[\w ']*$/.test(name)) {
if (/^[\x20-\x7e]*$/.test(name)) {
return '"' + name.replace(/([\\"])/g, '\\$1') + '"';
} else {
return libmime.encodeWord(name, 'Q', 52);
}
}
return name;
};

View File

@@ -0,0 +1,955 @@
'use strict';
var chai = require('chai');
var sinon = require('sinon');
var Buildmail = require('../src/buildmail');
var http = require('http');
var stream = require('stream');
var Transform = stream.Transform;
var expect = chai.expect;
chai.Assertion.includeStack = true;
describe('Buildmail', function() {
it('should create Buildmail object', function() {
expect(new Buildmail()).to.exist;
});
describe('#createChild', function() {
it('should create child', function() {
var mb = new Buildmail('multipart/mixed');
var child = mb.createChild('multipart/mixed');
expect(child.parentNode).to.equal(mb);
expect(child.rootNode).to.equal(mb);
var subchild1 = child.createChild('text/html');
expect(subchild1.parentNode).to.equal(child);
expect(subchild1.rootNode).to.equal(mb);
var subchild2 = child.createChild('text/html');
expect(subchild2.parentNode).to.equal(child);
expect(subchild2.rootNode).to.equal(mb);
});
});
describe('#appendChild', function() {
it('should append child node', function() {
var mb = new Buildmail('multipart/mixed');
var child = new Buildmail('text/plain');
mb.appendChild(child);
expect(child.parentNode).to.equal(mb);
expect(child.rootNode).to.equal(mb);
expect(mb.childNodes.length).to.equal(1);
expect(mb.childNodes[0]).to.equal(child);
});
});
describe('#replace', function() {
it('should replace node', function() {
var mb = new Buildmail(),
child = mb.createChild('text/plain'),
replacement = new Buildmail('image/png');
child.replace(replacement);
expect(mb.childNodes.length).to.equal(1);
expect(mb.childNodes[0]).to.equal(replacement);
});
});
describe('#remove', function() {
it('should remove node', function() {
var mb = new Buildmail(),
child = mb.createChild('text/plain');
child.remove();
expect(mb.childNodes.length).to.equal(0);
expect(child.parentNode).to.not.exist;
});
});
describe('#setHeader', function() {
it('should set header', function() {
var mb = new Buildmail();
mb.setHeader('key', 'value');
mb.setHeader('key', 'value1');
expect(mb.getHeader('Key')).to.equal('value1');
mb.setHeader([{
key: 'key',
value: 'value2'
}, {
key: 'key2',
value: 'value3'
}]);
expect(mb._headers).to.deep.equal([{
key: 'Key',
value: 'value2'
}, {
key: 'Key2',
value: 'value3'
}]);
mb.setHeader({
key: 'value4',
key2: 'value5'
});
expect(mb._headers).to.deep.equal([{
key: 'Key',
value: 'value4'
}, {
key: 'Key2',
value: 'value5'
}]);
});
});
describe('#addHeader', function() {
it('should add header', function() {
var mb = new Buildmail();
mb.addHeader('key', 'value1');
mb.addHeader('key', 'value2');
mb.addHeader([{
key: 'key',
value: 'value2'
}, {
key: 'key2',
value: 'value3'
}]);
mb.addHeader({
key: 'value4',
key2: 'value5'
});
expect(mb._headers).to.deep.equal([{
key: 'Key',
value: 'value1'
}, {
key: 'Key',
value: 'value2'
}, {
key: 'Key',
value: 'value2'
}, {
key: 'Key2',
value: 'value3'
}, {
key: 'Key',
value: 'value4'
}, {
key: 'Key2',
value: 'value5'
}]);
});
});
describe('#getHeader', function() {
it('should return first matching header value', function() {
var mb = new Buildmail();
mb._headers = [{
key: 'Key',
value: 'value4'
}, {
key: 'Key2',
value: 'value5'
}];
expect(mb.getHeader('KEY')).to.equal('value4');
});
});
describe('#setContent', function() {
it('should set the contents for a node', function() {
var mb = new Buildmail();
mb.setContent('abc');
expect(mb.content).to.equal('abc');
});
});
describe('#build', function() {
it('should build root node', function(done) {
var mb = new Buildmail('text/plain').
setHeader({
date: '12345',
'message-id': '67890'
}).
setContent('Hello world!'),
expected = 'Content-Type: text/plain\r\n' +
'Date: 12345\r\n' +
'Message-Id: <67890>\r\n' +
'Content-Transfer-Encoding: 7bit\r\n' +
'MIME-Version: 1.0\r\n' +
'\r\n' +
'Hello world!';
mb.build(function(err, msg) {
msg = msg.toString();
expect(msg).to.equal(expected);
done();
});
});
it('should build child node', function(done) {
var mb = new Buildmail('multipart/mixed'),
childNode = mb.createChild('text/plain').
setContent('Hello world!'),
expected = 'Content-Type: text/plain\r\n' +
'Content-Transfer-Encoding: 7bit\r\n' +
'\r\n' +
'Hello world!';
childNode.build(function(err, msg) {
msg = msg.toString();
expect(msg).to.equal(expected);
done();
});
});
it('should build multipart node', function(done) {
var mb = new Buildmail('multipart/mixed', {
baseBoundary: 'test'
}).
setHeader({
date: '12345',
'message-id': '67890'
}),
expected = 'Content-Type: multipart/mixed; boundary="----sinikael-?=_1-test"\r\n' +
'Date: 12345\r\n' +
'Message-Id: <67890>\r\n' +
'MIME-Version: 1.0\r\n' +
'\r\n' +
'------sinikael-?=_1-test\r\n' +
'Content-Type: text/plain\r\n' +
'Content-Transfer-Encoding: 7bit\r\n' +
'\r\n' +
'Hello world!\r\n' +
'------sinikael-?=_1-test--\r\n';
mb.createChild('text/plain').setContent('Hello world!');
mb.build(function(err, msg) {
msg = msg.toString();
expect(msg).to.equal(expected);
done();
});
});
it('should build root with generated headers', function(done) {
var mb = new Buildmail('text/plain');
mb.build(function(err, msg) {
msg = msg.toString();
expect(/^Date:\s/m.test(msg)).to.be.true;
expect(/^Message\-Id:\s</m.test(msg)).to.be.true;
expect(/^MIME-Version: 1.0$/m.test(msg)).to.be.true;
done();
});
});
it('should not include bcc in output, but should include it in envelope', function(done) {
var mb = new Buildmail('text/plain').
setHeader({
from: 'sender@example.com',
to: 'receiver@example.com',
bcc: 'bcc@example.com'
}),
envelope = mb.getEnvelope();
expect(envelope).to.deep.equal({
from: 'sender@example.com',
to: ['receiver@example.com', 'bcc@example.com']
});
mb.build(function(err, msg) {
msg = msg.toString();
expect(/^From: sender@example.com$/m.test(msg)).to.be.true;
expect(/^To: receiver@example.com$/m.test(msg)).to.be.true;
expect(!/^Bcc:/m.test(msg)).to.be.true;
done();
});
});
it('should include bcc in both output and envelope', function(done) {
var mb = new Buildmail('text/plain', {
keepBcc: true
}).
setHeader({
from: 'sender@example.com',
to: 'receiver@example.com',
bcc: 'bcc@example.com'
}),
envelope = mb.getEnvelope();
expect(envelope).to.deep.equal({
from: 'sender@example.com',
to: ['receiver@example.com', 'bcc@example.com']
});
mb.build(function(err, msg) {
msg = msg.toString();
expect(/^From: sender@example.com$/m.test(msg)).to.be.true;
expect(/^To: receiver@example.com$/m.test(msg)).to.be.true;
expect(/^Bcc: bcc@example.com$/m.test(msg)).to.be.true;
done();
});
});
it('should use set envelope', function(done) {
var mb = new Buildmail('text/plain').
setHeader({
from: 'sender@example.com',
to: 'receiver@example.com',
bcc: 'bcc@example.com'
}).setEnvelope({
from: 'a',
to: 'b'
}),
envelope = mb.getEnvelope();
expect(envelope).to.deep.equal({
from: 'a',
to: ['b']
});
mb.build(function(err, msg) {
msg = msg.toString();
expect(/^From: sender@example.com$/m.test(msg)).to.be.true;
expect(/^To: receiver@example.com$/m.test(msg)).to.be.true;
expect(!/^Bcc:/m.test(msg)).to.be.true;
done();
});
});
it('should have unicode subject', function(done) {
var mb = new Buildmail('text/plain').
setHeader({
subject: 'jõgeval istus kägu metsas'
});
mb.build(function(err, msg) {
msg = msg.toString();
expect(/^Subject: =\?UTF-8\?Q\?j=C3=B5geval\?= istus =\?UTF-8\?Q\?k=C3=A4gu\?= metsas$/m.test(msg)).to.be.true;
done();
});
});
it('should have unicode subject with strange characters', function(done) {
var mb = new Buildmail('text/plain').
setHeader({
subject: 'ˆ¸ÁÌÓıÏˇÁÛ^¸\\ÁıˆÌÁÛØ^\\˜Û˝™ˇıÓ¸^\\˜fi^\\·\\˜Ø^£˜#fi^\\£fi^\\£fi^\\'
});
mb.build(function(err, msg) {
msg = msg.toString();
expect(msg.match(/\bSubject: [^\r]*\r\n( [^\r]*\r\n)*/)[0]).to.equal('Subject: =?UTF-8?Q?=CB=86=C2=B8=C3=81=C3=8C=C3=93=C4=B1?=\r\n =?UTF-8?Q?=C3=8F=CB=87=C3=81=C3=9B^=C2=B8\\=C3=81?=\r\n =?UTF-8?Q?=C4=B1=CB=86=C3=8C=C3=81=C3=9B=C3=98^\\?=\r\n =?UTF-8?Q?=CB=9C=C3=9B=CB=9D=E2=84=A2=CB=87=C4=B1?=\r\n =?UTF-8?Q?=C3=93=C2=B8^\\=CB=9C=EF=AC=81^\\=C2=B7\\?=\r\n =?UTF-8?Q?=CB=9C=C3=98^=C2=A3=CB=9C#=EF=AC=81^\\?=\r\n =?UTF-8?Q?=C2=A3=EF=AC=81^\\=C2=A3=EF=AC=81^\\?=\r\n');
done();
});
});
it('should keep 7bit text as is', function(done) {
var mb = new Buildmail('text/plain').
setContent('tere tere');
mb.build(function(err, msg) {
msg = msg.toString();
expect(/\r\n\r\ntere tere$/.test(msg)).to.be.true;
expect(/^Content-Type: text\/plain$/m.test(msg)).to.be.true;
expect(/^Content-Transfer-Encoding: 7bit$/m.test(msg)).to.be.true;
done();
});
});
it('should stuff flowed space', function(done) {
var mb = new Buildmail('text/plain; format=flowed').
setContent('tere\r\nFrom\r\n Hello\r\n> abc\r\nabc');
mb.build(function(err, msg) {
msg = msg.toString();
expect(/^Content-Type: text\/plain; format=flowed$/m.test(msg)).to.be.true;
expect(/^Content-Transfer-Encoding: 7bit$/m.test(msg)).to.be.true;
msg = msg.split('\r\n\r\n');
msg.shift();
msg = msg.join('\r\n\r\n');
expect(msg).to.equal('tere\r\n From\r\n Hello\r\n > abc\r\nabc');
done();
});
});
it('should fetch ascii filename', function(done) {
var mb = new Buildmail('text/plain', {
filename: 'jogeva.txt'
}).
setContent('jogeva');
mb.build(function(err, msg) {
msg = msg.toString();
expect(/\r\n\r\njogeva$/.test(msg)).to.be.true;
expect(/^Content-Type: text\/plain$/m.test(msg)).to.be.true;
expect(/^Content-Transfer-Encoding: 7bit$/m.test(msg)).to.be.true;
expect(/^Content-Disposition: attachment; filename=jogeva.txt$/m.test(msg)).to.be.true;
done();
});
});
it('should set unicode filename', function(done) {
var mb = new Buildmail('text/plain', {
filename: 'jõgeva.txt'
}).
setContent('jõgeva');
mb.build(function(err, msg) {
msg = msg.toString();
expect(/^Content-Type: text\/plain; charset=utf-8$/m.test(msg)).to.be.true;
expect(/^Content-Transfer-Encoding: quoted-printable$/m.test(msg)).to.be.true;
expect(/^Content-Disposition: attachment; filename\*0\*=utf-8''j%C3%B5geva.txt$/m.test(msg)).to.be.true;
done();
});
});
it('should encode filename with a space', function(done) {
var mb = new Buildmail('text/plain', {
filename: 'document a.test.pdf'
}).
setContent('jõgeva');
mb.build(function(err, msg) {
msg = msg.toString();
expect(/^Content-Type: text\/plain; charset=utf-8$/m.test(msg)).to.be.true;
expect(/^Content-Transfer-Encoding: quoted-printable$/m.test(msg)).to.be.true;
expect(/^Content-Disposition: attachment; filename="document a.test.pdf"$/m.test(msg)).to.be.true;
done();
});
});
it('should detect content type from filename', function(done) {
var mb = new Buildmail(false, {
filename: 'jogeva.zip'
}).
setContent('jogeva');
mb.build(function(err, msg) {
msg = msg.toString();
expect(/^Content-Type: application\/zip$/m.test(msg)).to.be.true;
done();
});
});
it('should convert address objects', function(done) {
var mb = new Buildmail(false).
setHeader({
from: [{
name: 'the safewithme testuser',
address: 'safewithme.testuser@jõgeva.com'
}],
cc: [{
name: 'the safewithme testuser',
address: 'safewithme.testuser@jõgeva.com'
}]
});
expect(mb.getEnvelope()).to.deep.equal({
from: 'safewithme.testuser@xn--jgeva-dua.com',
to: [
'safewithme.testuser@xn--jgeva-dua.com'
]
});
mb.build(function(err, msg) {
msg = msg.toString();
expect(/^From: the safewithme testuser <safewithme.testuser@xn\-\-jgeva-dua.com>$/m.test(msg)).to.be.true;
expect(/^Cc: the safewithme testuser <safewithme.testuser@xn\-\-jgeva-dua.com>$/m.test(msg)).to.be.true;
done();
});
});
it('should skip empty header', function(done) {
var mb = new Buildmail('text/plain').
setHeader({
a: 'b',
cc: '',
dd: [],
o: false,
date: 'zzz',
'message-id': '67890'
}).
setContent('Hello world!'),
expected = 'Content-Type: text/plain\r\n' +
'A: b\r\n' +
'Date: zzz\r\n' +
'Message-Id: <67890>\r\n' +
'Content-Transfer-Encoding: 7bit\r\n' +
'MIME-Version: 1.0\r\n' +
'\r\n' +
'Hello world!';
mb.build(function(err, msg) {
msg = msg.toString();
expect(msg).to.equal(expected);
done();
});
});
it('should set default transfer encoding for application content', function(done) {
var mb = new Buildmail('application/x-my-stuff').
setHeader({
date: '12345',
'message-id': '67890'
}).
setContent('Hello world!'),
expected = 'Content-Type: application/x-my-stuff\r\n' +
'Date: 12345\r\n' +
'Message-Id: <67890>\r\n' +
'Content-Transfer-Encoding: base64\r\n' +
'MIME-Version: 1.0\r\n' +
'\r\n' +
'SGVsbG8gd29ybGQh';
mb.build(function(err, msg) {
msg = msg.toString();
expect(msg).to.equal(expected);
done();
});
});
it('should not set transfer encoding for multipart content', function(done) {
var mb = new Buildmail('multipart/global').
setHeader({
date: '12345',
'message-id': '67890'
}).
setContent('Hello world!'),
expected = 'Content-Type: multipart/global; boundary=abc\r\n' +
'Date: 12345\r\n' +
'Message-Id: <67890>\r\n' +
'MIME-Version: 1.0\r\n' +
'\r\n' +
'Hello world!\r\n' +
'--abc--' +
'\r\n';
mb.boundary = 'abc';
mb.build(function(err, msg) {
msg = msg.toString();
expect(msg).to.equal(expected);
done();
});
});
it('should use from domain for message-id', function(done) {
var mb = new Buildmail('text/plain').
setHeader({
from: 'test@example.com'
});
mb.build(function(err, msg) {
msg = msg.toString();
expect(/^Message-Id: <\d+(\-[a-f0-9]{8}){3}@example\.com>$/m.test(msg)).to.be.true;
done();
});
});
it('should fallback to localhost for message-id', function(done) {
var mb = new Buildmail('text/plain');
mb.build(function(err, msg) {
msg = msg.toString();
expect(/^Message-Id: <\d+(\-[a-f0-9]{8}){3}@localhost>$/m.test(msg)).to.be.true;
done();
});
});
});
describe('#getEnvelope', function() {
it('should get envelope', function() {
expect(new Buildmail().addHeader({
from: 'From <from@example.com>',
sender: 'Sender <sender@example.com>',
to: 'receiver1@example.com'
}).addHeader({
to: 'receiver2@example.com',
cc: 'receiver1@example.com, receiver3@example.com',
bcc: 'receiver4@example.com, Rec5 <receiver5@example.com>'
}).getEnvelope()).to.deep.equal({
from: 'from@example.com',
to: ['receiver1@example.com', 'receiver2@example.com', 'receiver3@example.com', 'receiver4@example.com', 'receiver5@example.com']
});
expect(new Buildmail().addHeader({
sender: 'Sender <sender@example.com>',
to: 'receiver1@example.com'
}).addHeader({
to: 'receiver2@example.com',
cc: 'receiver1@example.com, receiver3@example.com',
bcc: 'receiver4@example.com, Rec5 <receiver5@example.com>'
}).getEnvelope()).to.deep.equal({
from: 'sender@example.com',
to: ['receiver1@example.com', 'receiver2@example.com', 'receiver3@example.com', 'receiver4@example.com', 'receiver5@example.com']
});
});
});
describe('#getAddresses', function() {
it('should get address object', function() {
expect(new Buildmail().addHeader({
from: 'From <from@example.com>',
sender: 'Sender <sender@example.com>',
to: 'receiver1@example.com'
}).addHeader({
to: 'receiver2@example.com',
cc: 'receiver1@example.com, receiver3@example.com',
bcc: 'receiver4@example.com, Rec5 <receiver5@example.com>'
}).getAddresses()).to.deep.equal({
from: [{
address: 'from@example.com',
name: 'From'
}],
sender: [{
address: 'sender@example.com',
name: 'Sender'
}],
to: [{
address: 'receiver1@example.com',
name: ''
}, {
address: 'receiver2@example.com',
name: ''
}],
cc: [{
address: 'receiver1@example.com',
name: ''
}, {
address: 'receiver3@example.com',
name: ''
}],
bcc: [{
address: 'receiver4@example.com',
name: ''
}, {
address: 'receiver5@example.com',
name: 'Rec5'
}]
});
expect(new Buildmail().addHeader({
sender: 'Sender <sender@example.com>',
to: 'receiver1@example.com'
}).addHeader({
to: 'receiver2@example.com',
cc: 'receiver1@example.com, receiver1@example.com',
bcc: 'receiver4@example.com, Rec5 <receiver5@example.com>'
}).getAddresses()).to.deep.equal({
sender: [{
address: 'sender@example.com',
name: 'Sender'
}],
to: [{
address: 'receiver1@example.com',
name: ''
}, {
address: 'receiver2@example.com',
name: ''
}],
cc: [{
address: 'receiver1@example.com',
name: ''
}],
bcc: [{
address: 'receiver4@example.com',
name: ''
}, {
address: 'receiver5@example.com',
name: 'Rec5'
}]
});
});
});
describe('#_parseAddresses', function() {
it('should normalize header key', function() {
var mb = new Buildmail();
expect(mb._parseAddresses('test address@example.com')).to.deep.equal([{
address: 'address@example.com',
name: 'test'
}]);
expect(mb._parseAddresses(['test address@example.com'])).to.deep.equal([{
address: 'address@example.com',
name: 'test'
}]);
expect(mb._parseAddresses([
['test address@example.com']
])).to.deep.equal([{
address: 'address@example.com',
name: 'test'
}]);
expect(mb._parseAddresses([{
address: 'address@example.com',
name: 'test'
}])).to.deep.equal([{
address: 'address@example.com',
name: 'test'
}]);
});
});
describe('#_normalizeHeaderKey', function() {
it('should normalize header key', function() {
var mb = new Buildmail();
expect(mb._normalizeHeaderKey('key')).to.equal('Key');
expect(mb._normalizeHeaderKey('mime-vERSION')).to.equal('MIME-Version');
expect(mb._normalizeHeaderKey('-a-long-name')).to.equal('-A-Long-Name');
});
});
describe('#_handleContentType', function() {
it('should do nothing on non multipart', function() {
var mb = new Buildmail();
expect(mb.boundary).to.not.exist;
mb._handleContentType({
value: 'text/plain'
});
expect(mb.boundary).to.be.false;
expect(mb.multipart).to.be.false;
});
it('should use provided boundary', function() {
var mb = new Buildmail();
expect(mb.boundary).to.not.exist;
mb._handleContentType({
value: 'multipart/mixed',
params: {
boundary: 'abc'
}
});
expect(mb.boundary).to.equal('abc');
expect(mb.multipart).to.equal('mixed');
});
it('should generate boundary', function() {
var mb = new Buildmail();
sinon.stub(mb, '_generateBoundary').returns('def');
expect(mb.boundary).to.not.exist;
mb._handleContentType({
value: 'multipart/mixed',
params: {}
});
expect(mb.boundary).to.equal('def');
expect(mb.multipart).to.equal('mixed');
mb._generateBoundary.restore();
});
});
describe('#_generateBoundary', function() {
it('should generate boundary string', function() {
var mb = new Buildmail();
mb._nodeId = 'abc';
mb.rootNode.baseBoundary = 'def';
expect(mb._generateBoundary()).to.equal('----sinikael-?=_abc-def');
});
});
describe('#_encodeHeaderValue', function() {
it('should do nothing if possible', function() {
var mb = new Buildmail();
expect(mb._encodeHeaderValue('x-my', 'test value')).to.equal('test value');
});
it('should encode non ascii characters', function() {
var mb = new Buildmail();
expect(mb._encodeHeaderValue('x-my', 'test jõgeva value')).to.equal('test =?UTF-8?Q?j=C3=B5geva?= value');
});
it('should format references', function() {
var mb = new Buildmail();
expect(mb._encodeHeaderValue('references', 'abc def')).to.equal('<abc> <def>');
expect(mb._encodeHeaderValue('references', ['abc', 'def'])).to.equal('<abc> <def>');
});
it('should format message-id', function() {
var mb = new Buildmail();
expect(mb._encodeHeaderValue('message-id', 'abc')).to.equal('<abc>');
});
it('should format addresses', function() {
var mb = new Buildmail();
expect(mb._encodeHeaderValue('from', {
name: 'the safewithme testuser',
address: 'safewithme.testuser@jõgeva.com'
})).to.equal('the safewithme testuser <safewithme.testuser@xn--jgeva-dua.com>');
});
});
describe('#_convertAddresses', function() {
it('should convert address object to a string', function() {
var mb = new Buildmail();
expect(mb._convertAddresses([{
name: 'Jõgeva Ants',
address: 'ants@jõgeva.ee'
}, {
name: 'Composers',
group: [{
address: 'sebu@example.com',
name: 'Bach, Sebastian'
}, {
address: 'mozart@example.com',
name: 'Mozzie'
}]
}])).to.equal('=?UTF-8?Q?J=C3=B5geva_Ants?= <ants@xn--jgeva-dua.ee>, Composers:"Bach, Sebastian" <sebu@example.com>, Mozzie <mozart@example.com>;');
});
it('should keep ascii name as is', function() {
var mb = new Buildmail();
expect(mb._convertAddresses([{
name: 'O\'Vigala Sass',
address: 'a@b.c'
}])).to.equal('O\'Vigala Sass <a@b.c>');
});
it('should include name in quotes for special symbols', function() {
var mb = new Buildmail();
expect(mb._convertAddresses([{
name: 'Sass, Vigala',
address: 'a@b.c'
}])).to.equal('"Sass, Vigala" <a@b.c>');
});
it('should escape quotes', function() {
var mb = new Buildmail();
expect(mb._convertAddresses([{
name: '"Vigala Sass"',
address: 'a@b.c'
}])).to.equal('"\\"Vigala Sass\\"" <a@b.c>');
});
it('should mime encode unicode names', function() {
var mb = new Buildmail();
expect(mb._convertAddresses([{
name: '"Jõgeva Sass"',
address: 'a@b.c'
}])).to.equal('=?UTF-8?Q?=22J=C3=B5geva_Sass=22?= <a@b.c>');
});
});
describe('HTTP streaming', function() {
var port = 10337;
var server;
beforeEach(function(done) {
server = http.createServer(function(req, res) {
res.writeHead(200, {
'Content-Type': 'text/plain'
});
var data = new Buffer(new Array(1024 + 1).join('ä'), 'utf-8');
var i = 0;
var sendByte = function() {
if (i >= data.length) {
return res.end();
}
res.write(new Buffer([data[i++]]));
setImmediate(sendByte);
};
sendByte();
});
server.listen(port, done);
});
afterEach(function(done) {
server.close(done);
});
it('should pipe URL as an attachment', function(done) {
var mb = new Buildmail('text/plain').
setContent({
href: 'http://localhost:' + port
});
mb.build(function(err, msg) {
msg = msg.toString();
expect(/^=C3=A4/m.test(msg)).to.be.true;
done();
});
});
it('should not throw on error', function(done) {
var mb = new Buildmail('text/plain').
setContent({
href: 'http://__should_not_exist:58888'
});
mb.build(function(err, msg) {
msg = msg.toString();
expect(/ENOTFOUND/.test(msg)).to.be.true;
done();
});
});
});
describe('#transform', function() {
it('should pipe through provided stream', function(done) {
var mb = new Buildmail('text/plain').
setHeader({
date: '12345',
'message-id': '67890'
}).
setContent('Hello world!');
var expected = 'Content-Type:\ttext/plain\r\n' +
'Date:\t12345\r\n' +
'Message-Id:\t<67890>\r\n' +
'Content-Transfer-Encoding:\t7bit\r\n' +
'MIME-Version:\t1.0\r\n' +
'\r\n' +
'Hello\tworld!';
// Transform stream that replaces all spaces with tabs
var transform = new Transform();
transform._transform = function(chunk, encoding, done) {
if (encoding !== 'buffer') {
chunk = new Buffer(chunk, encoding);
}
for (var i = 0, len = chunk.length; i < len; i++) {
if (chunk[i] === 0x20) {
chunk[i] = 0x09;
}
}
this.push(chunk);
done();
};
mb.transform(transform);
mb.build(function(err, msg) {
msg = msg.toString();
expect(msg).to.equal(expected);
done();
});
});
});
});

View File

@@ -0,0 +1,4 @@
language: node_js
node_js:
- 0.8
- "0.10"

View File

@@ -0,0 +1,18 @@
This software is released under the MIT license:
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software is furnished to do so,
subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

View File

@@ -0,0 +1,20 @@
var http = require('http');
var hyperquest = require('../');
var server = http.createServer(function (req, res) {
res.write(req.url.slice(1) + '\n');
setTimeout(res.end.bind(res), 3000);
});
server.listen(5000, function () {
var pending = 20;
for (var i = 0; i < 20; i++) {
var r = hyperquest('http://localhost:5000/' + i);
r.pipe(process.stdout, { end: false });
r.on('end', function () {
if (--pending === 0) server.close();
});
}
});
process.stdout.setMaxListeners(0); // turn off annoying warnings

View File

@@ -0,0 +1,20 @@
var http = require('http');
var request = require('request');
var server = http.createServer(function (req, res) {
res.write(req.url.slice(1) + '\n');
setTimeout(res.end.bind(res), 3000);
});
server.listen(5000, function () {
var pending = 20;
for (var i = 0; i < 20; i++) {
var r = request('http://localhost:5000/' + i);
r.pipe(process.stdout, { end: false });
r.on('end', function () {
if (--pending === 0) server.close();
});
}
});
process.stdout.setMaxListeners(0); // turn off annoying warnings

View File

@@ -0,0 +1,2 @@
var hyperquest = require('../');
hyperquest('http://localhost:8000').pipe(process.stdout);

View File

@@ -0,0 +1,151 @@
var url = require('url');
var http = require('http');
var https = require('https');
var through = require('through2');
var duplexer = require('duplexer2');
module.exports = hyperquest;
function bind (obj, fn) {
var args = Array.prototype.slice.call(arguments, 2);
return function () {
var argv = args.concat(Array.prototype.slice.call(arguments));
return fn.apply(obj, argv);
}
}
function hyperquest (uri, opts, cb, extra) {
if (typeof uri === 'object') {
cb = opts;
opts = uri;
uri = undefined;
}
if (typeof opts === 'function') {
cb = opts;
opts = undefined;
}
if (!opts) opts = {};
if (uri !== undefined) opts.uri = uri;
if (extra) opts.method = extra.method;
var req = new Req(opts);
var ws = req.duplex && through();
var rs = through();
var dup = req.duplex ? duplexer(ws, rs) : rs;
if (!req.duplex) {
rs.writable = false;
}
dup.request = req;
dup.setHeader = bind(req, req.setHeader);
dup.setLocation = bind(req, req.setLocation);
var closed = false;
dup.on('close', function () { closed = true });
process.nextTick(function () {
if (closed) return;
dup.on('close', function () { r.destroy() });
var r = req._send();
r.on('error', bind(dup, dup.emit, 'error'));
dup.emit('request', r);
r.on('response', function (res) {
dup.response = res;
dup.emit('response', res);
if (req.duplex) res.pipe(rs)
else {
res.on('data', function (buf) { rs.push(buf) });
res.on('end', function () { rs.push(null) });
}
});
if (req.duplex) {
ws.pipe(r);
}
else r.end();
});
if (cb) {
dup.on('error', cb);
dup.on('response', bind(dup, cb, null));
}
return dup;
}
hyperquest.get = hyperquest;
hyperquest.post = function (uri, opts, cb) {
return hyperquest(uri, opts, cb, { method: 'POST' });
};
hyperquest.put = function (uri, opts, cb) {
return hyperquest(uri, opts, cb, { method: 'PUT' });
};
hyperquest['delete'] = function (uri, opts, cb) {
return hyperquest(uri, opts, cb, { method: 'DELETE' });
};
function Req (opts) {
this.headers = opts.headers || {};
var method = (opts.method || 'GET').toUpperCase();
this.method = method;
this.duplex = !(method === 'GET' || method === 'DELETE'
|| method === 'HEAD');
this.auth = opts.auth;
this.options = opts;
if (opts.uri) this.setLocation(opts.uri);
}
Req.prototype._send = function () {
this._sent = true;
var headers = this.headers || {};
var u = url.parse(this.uri);
var au = u.auth || this.auth;
if (au) {
headers.authorization = 'Basic ' + Buffer(au).toString('base64');
}
var protocol = u.protocol || '';
var iface = protocol === 'https:' ? https : http;
var opts = {
scheme: protocol.replace(/:$/, ''),
method: this.method,
host: u.hostname,
port: Number(u.port) || (protocol === 'https:' ? 443 : 80),
path: u.path,
agent: this.options.agent || false,
headers: headers,
withCredentials: this.options.withCredentials
};
if (protocol === 'https:') {
opts.pfx = this.options.pfx;
opts.key = this.options.key;
opts.cert = this.options.cert;
opts.ca = this.options.ca;
opts.ciphers = this.options.ciphers;
opts.rejectUnauthorized = this.options.rejectUnauthorized;
opts.secureProtocol = this.options.secureProtocol;
}
var req = iface.request(opts);
if (req.setTimeout) req.setTimeout(Math.pow(2, 32) * 1000);
return req;
};
Req.prototype.setHeader = function (key, value) {
if (this._sent) throw new Error('request already sent');
this.headers[key] = value;
return this;
};
Req.prototype.setLocation = function (uri) {
this.uri = uri;
return this;
};

View File

@@ -0,0 +1 @@
/node_modules

View File

@@ -0,0 +1,3 @@
language: node_js
node_js:
- "0.10"

View File

@@ -0,0 +1,26 @@
Copyright (c) 2013, Deoxxa Development
======================================
All rights reserved.
--------------------
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
1. Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
3. Neither the name of Deoxxa Development nor the names of its contributors
may be used to endorse or promote products derived from this software
without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY DEOXXA DEVELOPMENT ''AS IS'' AND ANY
EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL DEOXXA DEVELOPMENT BE LIABLE FOR ANY
DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

View File

@@ -0,0 +1,129 @@
duplexer2 [![build status](https://travis-ci.org/deoxxa/duplexer2.png)](https://travis-ci.org/deoxxa/fork)
=========
Like duplexer (http://npm.im/duplexer) but using streams2.
Overview
--------
duplexer2 is a reimplementation of [duplexer](http://npm.im/duplexer) using the
readable-stream API which is standard in node as of v0.10. Everything largely
works the same.
Installation
------------
Available via [npm](http://npmjs.org/):
> $ npm install duplexer2
Or via git:
> $ git clone git://github.com/deoxxa/duplexer2.git node_modules/duplexer2
API
---
**duplexer2**
Creates a new `DuplexWrapper` object, which is the actual class that implements
most of the fun stuff. All that fun stuff is hidden. DON'T LOOK.
```javascript
duplexer2([options], writable, readable)
```
```javascript
var duplex = duplexer2(new stream.Writable(), new stream.Readable());
```
Arguments
* __options__ - an object specifying the regular `stream.Duplex` options, as
well as the properties described below.
* __writable__ - a writable stream
* __readable__ - a readable stream
Options
* __bubbleErrors__ - a boolean value that specifies whether to bubble errors
from the underlying readable/writable streams. Default is `true`.
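As a rough sketch (not part of the original README), passing `bubbleErrors: false` as the leading options argument keeps errors on the wrapped streams instead of re-emitting them on the wrapper:
```javascript
var duplex = duplexer2({bubbleErrors: false}, writable, readable);

// errors now stay on the underlying streams, so listen for them there
writable.on("error", function(err) { console.error("write side:", err); });
readable.on("error", function(err) { console.error("read side:", err); });
```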
Example
-------
Also see [example.js](https://github.com/deoxxa/duplexer2/blob/master/example.js).
Code:
```javascript
var stream = require("stream");
var duplexer2 = require("duplexer2");
var writable = new stream.Writable({objectMode: true}),
readable = new stream.Readable({objectMode: true});
writable._write = function _write(input, encoding, done) {
if (readable.push(input)) {
return done();
} else {
readable.once("drain", done);
}
};
readable._read = function _read(n) {
// no-op
};
// simulate the readable thing closing after a bit
writable.once("finish", function() {
setTimeout(function() {
readable.push(null);
}, 500);
});
var duplex = duplexer2(writable, readable);
duplex.on("data", function(e) {
console.log("got data", JSON.stringify(e));
});
duplex.on("finish", function() {
console.log("got finish event");
});
duplex.on("end", function() {
console.log("got end event");
});
duplex.write("oh, hi there", function() {
console.log("finished writing");
});
duplex.end(function() {
console.log("finished ending");
});
```
Output:
```
got data "oh, hi there"
finished writing
got finish event
finished ending
got end event
```
License
-------
3-clause BSD. A copy is included with the source.
Contact
-------
* GitHub ([deoxxa](http://github.com/deoxxa))
* Twitter ([@deoxxa](http://twitter.com/deoxxa))
* Email ([deoxxa@fknsrs.biz](mailto:deoxxa@fknsrs.biz))

View File

@@ -0,0 +1,49 @@
#!/usr/bin/env node
var stream = require("readable-stream");
var duplexer2 = require("./");
var writable = new stream.Writable({objectMode: true}),
readable = new stream.Readable({objectMode: true});
writable._write = function _write(input, encoding, done) {
if (readable.push(input)) {
return done();
} else {
readable.once("drain", done);
}
};
readable._read = function _read(n) {
// no-op
};
// simulate the readable thing closing after a bit
writable.once("finish", function() {
setTimeout(function() {
readable.push(null);
}, 500);
});
var duplex = duplexer2(writable, readable);
duplex.on("data", function(e) {
console.log("got data", JSON.stringify(e));
});
duplex.on("finish", function() {
console.log("got finish event");
});
duplex.on("end", function() {
console.log("got end event");
});
duplex.write("oh, hi there", function() {
console.log("finished writing");
});
duplex.end(function() {
console.log("finished ending");
});

View File

@@ -0,0 +1,62 @@
var stream = require("readable-stream");
var duplex2 = module.exports = function duplex2(options, writable, readable) {
return new DuplexWrapper(options, writable, readable);
};
var DuplexWrapper = exports.DuplexWrapper = function DuplexWrapper(options, writable, readable) {
if (typeof readable === "undefined") {
readable = writable;
writable = options;
options = null;
}
options = options || {};
options.objectMode = true;
stream.Duplex.call(this, options);
this._bubbleErrors = (typeof options.bubbleErrors === "undefined") || !!options.bubbleErrors;
this._writable = writable;
this._readable = readable;
var self = this;
writable.once("finish", function() {
self.end();
});
this.once("finish", function() {
writable.end();
});
readable.on("data", function(e) {
if (!self.push(e)) {
readable.pause();
}
});
readable.once("end", function() {
return self.push(null);
});
if (this._bubbleErrors) {
writable.on("error", function(err) {
return self.emit("error", err);
});
readable.on("error", function(err) {
return self.emit("error", err);
});
}
};
DuplexWrapper.prototype = Object.create(stream.Duplex.prototype, {constructor: {value: DuplexWrapper}});
DuplexWrapper.prototype._write = function _write(input, encoding, done) {
this._writable.write(input, encoding, done);
};
DuplexWrapper.prototype._read = function _read(n) {
this._readable.resume();
};

View File

@@ -0,0 +1,5 @@
build/
test/
examples/
fs.js
zlib.js

View File

@@ -0,0 +1,18 @@
Copyright Joyent, Inc. and other Node contributors. All rights reserved.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to
deal in the Software without restriction, including without limitation the
rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
sell copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
IN THE SOFTWARE.

View File

@@ -0,0 +1,15 @@
# readable-stream
***Node-core streams for userland***
[![NPM](https://nodei.co/npm/readable-stream.png?downloads=true&downloadRank=true)](https://nodei.co/npm/readable-stream/)
[![NPM](https://nodei.co/npm-dl/readable-stream.png&months=6&height=3)](https://nodei.co/npm/readable-stream/)
This package is a mirror of the Streams2 and Streams3 implementations in Node-core.
If you want to guarantee a stable streams base, regardless of what version of Node you, or the users of your libraries, are using, use **readable-stream** *only* and avoid the *"stream"* module in Node-core.
**readable-stream** comes in two major versions, v1.0.x and v1.1.x. The former tracks the Streams2 implementation in Node 0.10, including bug-fixes and minor improvements as they are added. The latter tracks Streams3 as it develops in Node 0.11; we will likely see a v1.2.x branch for Node 0.12.
**readable-stream** uses proper patch-level versioning so if you pin to `"~1.0.0"` you'll get the latest Node 0.10 Streams2 implementation, including any fixes and minor non-breaking improvements. The patch-level versions of 1.0.x and 1.1.x should mirror the patch-level versions of Node-core releases. You should prefer the **1.0.x** releases for now and, when you're ready to start using Streams3, pin to `"~1.1.0"`.
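For example, a minimal sketch of the drop-in usage described above (assuming `readable-stream` is installed and pinned as suggested):
```javascript
// Swap the core module for the userland mirror to get the same
// Streams2 behaviour on node 0.8 and 0.10 alike.
var Readable = require('readable-stream').Readable;

var rs = new Readable();
rs._read = function() {}; // no-op; all data is pushed up front

rs.push('hello ');
rs.push('world\n');
rs.push(null); // signal end of stream

rs.pipe(process.stdout);
```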

View File

@@ -0,0 +1 @@
module.exports = require("./lib/_stream_duplex.js")

View File

@@ -0,0 +1,923 @@
diff --git a/lib/_stream_duplex.js b/lib/_stream_duplex.js
index c5a741c..a2e0d8e 100644
--- a/lib/_stream_duplex.js
+++ b/lib/_stream_duplex.js
@@ -26,8 +26,8 @@
module.exports = Duplex;
var util = require('util');
-var Readable = require('_stream_readable');
-var Writable = require('_stream_writable');
+var Readable = require('./_stream_readable');
+var Writable = require('./_stream_writable');
util.inherits(Duplex, Readable);
diff --git a/lib/_stream_passthrough.js b/lib/_stream_passthrough.js
index a5e9864..330c247 100644
--- a/lib/_stream_passthrough.js
+++ b/lib/_stream_passthrough.js
@@ -25,7 +25,7 @@
module.exports = PassThrough;
-var Transform = require('_stream_transform');
+var Transform = require('./_stream_transform');
var util = require('util');
util.inherits(PassThrough, Transform);
diff --git a/lib/_stream_readable.js b/lib/_stream_readable.js
index 0c3fe3e..90a8298 100644
--- a/lib/_stream_readable.js
+++ b/lib/_stream_readable.js
@@ -23,10 +23,34 @@ module.exports = Readable;
Readable.ReadableState = ReadableState;
var EE = require('events').EventEmitter;
+if (!EE.listenerCount) EE.listenerCount = function(emitter, type) {
+ return emitter.listeners(type).length;
+};
+
+if (!global.setImmediate) global.setImmediate = function setImmediate(fn) {
+ return setTimeout(fn, 0);
+};
+if (!global.clearImmediate) global.clearImmediate = function clearImmediate(i) {
+ return clearTimeout(i);
+};
+
var Stream = require('stream');
var util = require('util');
+if (!util.isUndefined) {
+ var utilIs = require('core-util-is');
+ for (var f in utilIs) {
+ util[f] = utilIs[f];
+ }
+}
var StringDecoder;
-var debug = util.debuglog('stream');
+var debug;
+if (util.debuglog)
+ debug = util.debuglog('stream');
+else try {
+ debug = require('debuglog')('stream');
+} catch (er) {
+ debug = function() {};
+}
util.inherits(Readable, Stream);
@@ -380,7 +404,7 @@ function chunkInvalid(state, chunk) {
function onEofChunk(stream, state) {
- if (state.decoder && !state.ended) {
+ if (state.decoder && !state.ended && state.decoder.end) {
var chunk = state.decoder.end();
if (chunk && chunk.length) {
state.buffer.push(chunk);
diff --git a/lib/_stream_transform.js b/lib/_stream_transform.js
index b1f9fcc..b0caf57 100644
--- a/lib/_stream_transform.js
+++ b/lib/_stream_transform.js
@@ -64,8 +64,14 @@
module.exports = Transform;
-var Duplex = require('_stream_duplex');
+var Duplex = require('./_stream_duplex');
var util = require('util');
+if (!util.isUndefined) {
+ var utilIs = require('core-util-is');
+ for (var f in utilIs) {
+ util[f] = utilIs[f];
+ }
+}
util.inherits(Transform, Duplex);
diff --git a/lib/_stream_writable.js b/lib/_stream_writable.js
index ba2e920..f49288b 100644
--- a/lib/_stream_writable.js
+++ b/lib/_stream_writable.js
@@ -27,6 +27,12 @@ module.exports = Writable;
Writable.WritableState = WritableState;
var util = require('util');
+if (!util.isUndefined) {
+ var utilIs = require('core-util-is');
+ for (var f in utilIs) {
+ util[f] = utilIs[f];
+ }
+}
var Stream = require('stream');
util.inherits(Writable, Stream);
@@ -119,7 +125,7 @@ function WritableState(options, stream) {
function Writable(options) {
// Writable ctor is applied to Duplexes, though they're not
// instanceof Writable, they're instanceof Readable.
- if (!(this instanceof Writable) && !(this instanceof Stream.Duplex))
+ if (!(this instanceof Writable) && !(this instanceof require('./_stream_duplex')))
return new Writable(options);
this._writableState = new WritableState(options, this);
diff --git a/test/simple/test-stream-big-push.js b/test/simple/test-stream-big-push.js
index e3787e4..8cd2127 100644
--- a/test/simple/test-stream-big-push.js
+++ b/test/simple/test-stream-big-push.js
@@ -21,7 +21,7 @@
var common = require('../common');
var assert = require('assert');
-var stream = require('stream');
+var stream = require('../../');
var str = 'asdfasdfasdfasdfasdf';
var r = new stream.Readable({
diff --git a/test/simple/test-stream-end-paused.js b/test/simple/test-stream-end-paused.js
index bb73777..d40efc7 100644
--- a/test/simple/test-stream-end-paused.js
+++ b/test/simple/test-stream-end-paused.js
@@ -25,7 +25,7 @@ var gotEnd = false;
// Make sure we don't miss the end event for paused 0-length streams
-var Readable = require('stream').Readable;
+var Readable = require('../../').Readable;
var stream = new Readable();
var calledRead = false;
stream._read = function() {
diff --git a/test/simple/test-stream-pipe-after-end.js b/test/simple/test-stream-pipe-after-end.js
index b46ee90..0be8366 100644
--- a/test/simple/test-stream-pipe-after-end.js
+++ b/test/simple/test-stream-pipe-after-end.js
@@ -22,8 +22,8 @@
var common = require('../common');
var assert = require('assert');
-var Readable = require('_stream_readable');
-var Writable = require('_stream_writable');
+var Readable = require('../../lib/_stream_readable');
+var Writable = require('../../lib/_stream_writable');
var util = require('util');
util.inherits(TestReadable, Readable);
diff --git a/test/simple/test-stream-pipe-cleanup.js b/test/simple/test-stream-pipe-cleanup.js
deleted file mode 100644
index f689358..0000000
--- a/test/simple/test-stream-pipe-cleanup.js
+++ /dev/null
@@ -1,122 +0,0 @@
-// Copyright Joyent, Inc. and other Node contributors.
-//
-// Permission is hereby granted, free of charge, to any person obtaining a
-// copy of this software and associated documentation files (the
-// "Software"), to deal in the Software without restriction, including
-// without limitation the rights to use, copy, modify, merge, publish,
-// distribute, sublicense, and/or sell copies of the Software, and to permit
-// persons to whom the Software is furnished to do so, subject to the
-// following conditions:
-//
-// The above copyright notice and this permission notice shall be included
-// in all copies or substantial portions of the Software.
-//
-// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
-// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
-// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
-// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
-// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
-// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
-// USE OR OTHER DEALINGS IN THE SOFTWARE.
-
-// This test asserts that Stream.prototype.pipe does not leave listeners
-// hanging on the source or dest.
-
-var common = require('../common');
-var stream = require('stream');
-var assert = require('assert');
-var util = require('util');
-
-function Writable() {
- this.writable = true;
- this.endCalls = 0;
- stream.Stream.call(this);
-}
-util.inherits(Writable, stream.Stream);
-Writable.prototype.end = function() {
- this.endCalls++;
-};
-
-Writable.prototype.destroy = function() {
- this.endCalls++;
-};
-
-function Readable() {
- this.readable = true;
- stream.Stream.call(this);
-}
-util.inherits(Readable, stream.Stream);
-
-function Duplex() {
- this.readable = true;
- Writable.call(this);
-}
-util.inherits(Duplex, Writable);
-
-var i = 0;
-var limit = 100;
-
-var w = new Writable();
-
-var r;
-
-for (i = 0; i < limit; i++) {
- r = new Readable();
- r.pipe(w);
- r.emit('end');
-}
-assert.equal(0, r.listeners('end').length);
-assert.equal(limit, w.endCalls);
-
-w.endCalls = 0;
-
-for (i = 0; i < limit; i++) {
- r = new Readable();
- r.pipe(w);
- r.emit('close');
-}
-assert.equal(0, r.listeners('close').length);
-assert.equal(limit, w.endCalls);
-
-w.endCalls = 0;
-
-r = new Readable();
-
-for (i = 0; i < limit; i++) {
- w = new Writable();
- r.pipe(w);
- w.emit('close');
-}
-assert.equal(0, w.listeners('close').length);
-
-r = new Readable();
-w = new Writable();
-var d = new Duplex();
-r.pipe(d); // pipeline A
-d.pipe(w); // pipeline B
-assert.equal(r.listeners('end').length, 2); // A.onend, A.cleanup
-assert.equal(r.listeners('close').length, 2); // A.onclose, A.cleanup
-assert.equal(d.listeners('end').length, 2); // B.onend, B.cleanup
-assert.equal(d.listeners('close').length, 3); // A.cleanup, B.onclose, B.cleanup
-assert.equal(w.listeners('end').length, 0);
-assert.equal(w.listeners('close').length, 1); // B.cleanup
-
-r.emit('end');
-assert.equal(d.endCalls, 1);
-assert.equal(w.endCalls, 0);
-assert.equal(r.listeners('end').length, 0);
-assert.equal(r.listeners('close').length, 0);
-assert.equal(d.listeners('end').length, 2); // B.onend, B.cleanup
-assert.equal(d.listeners('close').length, 2); // B.onclose, B.cleanup
-assert.equal(w.listeners('end').length, 0);
-assert.equal(w.listeners('close').length, 1); // B.cleanup
-
-d.emit('end');
-assert.equal(d.endCalls, 1);
-assert.equal(w.endCalls, 1);
-assert.equal(r.listeners('end').length, 0);
-assert.equal(r.listeners('close').length, 0);
-assert.equal(d.listeners('end').length, 0);
-assert.equal(d.listeners('close').length, 0);
-assert.equal(w.listeners('end').length, 0);
-assert.equal(w.listeners('close').length, 0);
diff --git a/test/simple/test-stream-pipe-error-handling.js b/test/simple/test-stream-pipe-error-handling.js
index c5d724b..c7d6b7d 100644
--- a/test/simple/test-stream-pipe-error-handling.js
+++ b/test/simple/test-stream-pipe-error-handling.js
@@ -21,7 +21,7 @@
var common = require('../common');
var assert = require('assert');
-var Stream = require('stream').Stream;
+var Stream = require('../../').Stream;
(function testErrorListenerCatches() {
var source = new Stream();
diff --git a/test/simple/test-stream-pipe-event.js b/test/simple/test-stream-pipe-event.js
index cb9d5fe..56f8d61 100644
--- a/test/simple/test-stream-pipe-event.js
+++ b/test/simple/test-stream-pipe-event.js
@@ -20,7 +20,7 @@
// USE OR OTHER DEALINGS IN THE SOFTWARE.
var common = require('../common');
-var stream = require('stream');
+var stream = require('../../');
var assert = require('assert');
var util = require('util');
diff --git a/test/simple/test-stream-push-order.js b/test/simple/test-stream-push-order.js
index f2e6ec2..a5c9bf9 100644
--- a/test/simple/test-stream-push-order.js
+++ b/test/simple/test-stream-push-order.js
@@ -20,7 +20,7 @@
// USE OR OTHER DEALINGS IN THE SOFTWARE.
var common = require('../common.js');
-var Readable = require('stream').Readable;
+var Readable = require('../../').Readable;
var assert = require('assert');
var s = new Readable({
diff --git a/test/simple/test-stream-push-strings.js b/test/simple/test-stream-push-strings.js
index 06f43dc..1701a9a 100644
--- a/test/simple/test-stream-push-strings.js
+++ b/test/simple/test-stream-push-strings.js
@@ -22,7 +22,7 @@
var common = require('../common');
var assert = require('assert');
-var Readable = require('stream').Readable;
+var Readable = require('../../').Readable;
var util = require('util');
util.inherits(MyStream, Readable);
diff --git a/test/simple/test-stream-readable-event.js b/test/simple/test-stream-readable-event.js
index ba6a577..a8e6f7b 100644
--- a/test/simple/test-stream-readable-event.js
+++ b/test/simple/test-stream-readable-event.js
@@ -22,7 +22,7 @@
var common = require('../common');
var assert = require('assert');
-var Readable = require('stream').Readable;
+var Readable = require('../../').Readable;
(function first() {
// First test, not reading when the readable is added.
diff --git a/test/simple/test-stream-readable-flow-recursion.js b/test/simple/test-stream-readable-flow-recursion.js
index 2891ad6..11689ba 100644
--- a/test/simple/test-stream-readable-flow-recursion.js
+++ b/test/simple/test-stream-readable-flow-recursion.js
@@ -27,7 +27,7 @@ var assert = require('assert');
// more data continuously, but without triggering a nextTick
// warning or RangeError.
-var Readable = require('stream').Readable;
+var Readable = require('../../').Readable;
// throw an error if we trigger a nextTick warning.
process.throwDeprecation = true;
diff --git a/test/simple/test-stream-unshift-empty-chunk.js b/test/simple/test-stream-unshift-empty-chunk.js
index 0c96476..7827538 100644
--- a/test/simple/test-stream-unshift-empty-chunk.js
+++ b/test/simple/test-stream-unshift-empty-chunk.js
@@ -24,7 +24,7 @@ var assert = require('assert');
// This test verifies that stream.unshift(Buffer(0)) or
// stream.unshift('') does not set state.reading=false.
-var Readable = require('stream').Readable;
+var Readable = require('../../').Readable;
var r = new Readable();
var nChunks = 10;
diff --git a/test/simple/test-stream-unshift-read-race.js b/test/simple/test-stream-unshift-read-race.js
index 83fd9fa..17c18aa 100644
--- a/test/simple/test-stream-unshift-read-race.js
+++ b/test/simple/test-stream-unshift-read-race.js
@@ -29,7 +29,7 @@ var assert = require('assert');
// 3. push() after the EOF signaling null is an error.
// 4. _read() is not called after pushing the EOF null chunk.
-var stream = require('stream');
+var stream = require('../../');
var hwm = 10;
var r = stream.Readable({ highWaterMark: hwm });
var chunks = 10;
@@ -51,7 +51,14 @@ r._read = function(n) {
function push(fast) {
assert(!pushedNull, 'push() after null push');
- var c = pos >= data.length ? null : data.slice(pos, pos + n);
+ var c;
+ if (pos >= data.length)
+ c = null;
+ else {
+ if (n + pos > data.length)
+ n = data.length - pos;
+ c = data.slice(pos, pos + n);
+ }
pushedNull = c === null;
if (fast) {
pos += n;
diff --git a/test/simple/test-stream-writev.js b/test/simple/test-stream-writev.js
index 5b49e6e..b5321f3 100644
--- a/test/simple/test-stream-writev.js
+++ b/test/simple/test-stream-writev.js
@@ -22,7 +22,7 @@
var common = require('../common');
var assert = require('assert');
-var stream = require('stream');
+var stream = require('../../');
var queue = [];
for (var decode = 0; decode < 2; decode++) {
diff --git a/test/simple/test-stream2-basic.js b/test/simple/test-stream2-basic.js
index 3814bf0..248c1be 100644
--- a/test/simple/test-stream2-basic.js
+++ b/test/simple/test-stream2-basic.js
@@ -21,7 +21,7 @@
var common = require('../common.js');
-var R = require('_stream_readable');
+var R = require('../../lib/_stream_readable');
var assert = require('assert');
var util = require('util');
diff --git a/test/simple/test-stream2-compatibility.js b/test/simple/test-stream2-compatibility.js
index 6cdd4e9..f0fa84b 100644
--- a/test/simple/test-stream2-compatibility.js
+++ b/test/simple/test-stream2-compatibility.js
@@ -21,7 +21,7 @@
var common = require('../common.js');
-var R = require('_stream_readable');
+var R = require('../../lib/_stream_readable');
var assert = require('assert');
var util = require('util');
diff --git a/test/simple/test-stream2-finish-pipe.js b/test/simple/test-stream2-finish-pipe.js
index 39b274f..006a19b 100644
--- a/test/simple/test-stream2-finish-pipe.js
+++ b/test/simple/test-stream2-finish-pipe.js
@@ -20,7 +20,7 @@
// USE OR OTHER DEALINGS IN THE SOFTWARE.
var common = require('../common.js');
-var stream = require('stream');
+var stream = require('../../');
var Buffer = require('buffer').Buffer;
var r = new stream.Readable();
diff --git a/test/simple/test-stream2-fs.js b/test/simple/test-stream2-fs.js
deleted file mode 100644
index e162406..0000000
--- a/test/simple/test-stream2-fs.js
+++ /dev/null
@@ -1,72 +0,0 @@
-// Copyright Joyent, Inc. and other Node contributors.
-//
-// Permission is hereby granted, free of charge, to any person obtaining a
-// copy of this software and associated documentation files (the
-// "Software"), to deal in the Software without restriction, including
-// without limitation the rights to use, copy, modify, merge, publish,
-// distribute, sublicense, and/or sell copies of the Software, and to permit
-// persons to whom the Software is furnished to do so, subject to the
-// following conditions:
-//
-// The above copyright notice and this permission notice shall be included
-// in all copies or substantial portions of the Software.
-//
-// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
-// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
-// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
-// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
-// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
-// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
-// USE OR OTHER DEALINGS IN THE SOFTWARE.
-
-
-var common = require('../common.js');
-var R = require('_stream_readable');
-var assert = require('assert');
-
-var fs = require('fs');
-var FSReadable = fs.ReadStream;
-
-var path = require('path');
-var file = path.resolve(common.fixturesDir, 'x1024.txt');
-
-var size = fs.statSync(file).size;
-
-var expectLengths = [1024];
-
-var util = require('util');
-var Stream = require('stream');
-
-util.inherits(TestWriter, Stream);
-
-function TestWriter() {
- Stream.apply(this);
- this.buffer = [];
- this.length = 0;
-}
-
-TestWriter.prototype.write = function(c) {
- this.buffer.push(c.toString());
- this.length += c.length;
- return true;
-};
-
-TestWriter.prototype.end = function(c) {
- if (c) this.buffer.push(c.toString());
- this.emit('results', this.buffer);
-}
-
-var r = new FSReadable(file);
-var w = new TestWriter();
-
-w.on('results', function(res) {
- console.error(res, w.length);
- assert.equal(w.length, size);
- var l = 0;
- assert.deepEqual(res.map(function (c) {
- return c.length;
- }), expectLengths);
- console.log('ok');
-});
-
-r.pipe(w);
diff --git a/test/simple/test-stream2-httpclient-response-end.js b/test/simple/test-stream2-httpclient-response-end.js
deleted file mode 100644
index 15cffc2..0000000
--- a/test/simple/test-stream2-httpclient-response-end.js
+++ /dev/null
@@ -1,52 +0,0 @@
-// Copyright Joyent, Inc. and other Node contributors.
-//
-// Permission is hereby granted, free of charge, to any person obtaining a
-// copy of this software and associated documentation files (the
-// "Software"), to deal in the Software without restriction, including
-// without limitation the rights to use, copy, modify, merge, publish,
-// distribute, sublicense, and/or sell copies of the Software, and to permit
-// persons to whom the Software is furnished to do so, subject to the
-// following conditions:
-//
-// The above copyright notice and this permission notice shall be included
-// in all copies or substantial portions of the Software.
-//
-// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
-// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
-// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
-// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
-// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
-// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
-// USE OR OTHER DEALINGS IN THE SOFTWARE.
-
-var common = require('../common.js');
-var assert = require('assert');
-var http = require('http');
-var msg = 'Hello';
-var readable_event = false;
-var end_event = false;
-var server = http.createServer(function(req, res) {
- res.writeHead(200, {'Content-Type': 'text/plain'});
- res.end(msg);
-}).listen(common.PORT, function() {
- http.get({port: common.PORT}, function(res) {
- var data = '';
- res.on('readable', function() {
- console.log('readable event');
- readable_event = true;
- data += res.read();
- });
- res.on('end', function() {
- console.log('end event');
- end_event = true;
- assert.strictEqual(msg, data);
- server.close();
- });
- });
-});
-
-process.on('exit', function() {
- assert(readable_event);
- assert(end_event);
-});
-
diff --git a/test/simple/test-stream2-large-read-stall.js b/test/simple/test-stream2-large-read-stall.js
index 2fbfbca..667985b 100644
--- a/test/simple/test-stream2-large-read-stall.js
+++ b/test/simple/test-stream2-large-read-stall.js
@@ -30,7 +30,7 @@ var PUSHSIZE = 20;
var PUSHCOUNT = 1000;
var HWM = 50;
-var Readable = require('stream').Readable;
+var Readable = require('../../').Readable;
var r = new Readable({
highWaterMark: HWM
});
@@ -39,23 +39,23 @@ var rs = r._readableState;
r._read = push;
r.on('readable', function() {
- console.error('>> readable');
+ //console.error('>> readable');
do {
- console.error(' > read(%d)', READSIZE);
+ //console.error(' > read(%d)', READSIZE);
var ret = r.read(READSIZE);
- console.error(' < %j (%d remain)', ret && ret.length, rs.length);
+ //console.error(' < %j (%d remain)', ret && ret.length, rs.length);
} while (ret && ret.length === READSIZE);
- console.error('<< after read()',
- ret && ret.length,
- rs.needReadable,
- rs.length);
+ //console.error('<< after read()',
+ // ret && ret.length,
+ // rs.needReadable,
+ // rs.length);
});
var endEmitted = false;
r.on('end', function() {
endEmitted = true;
- console.error('end');
+ //console.error('end');
});
var pushes = 0;
@@ -64,11 +64,11 @@ function push() {
return;
if (pushes++ === PUSHCOUNT) {
- console.error(' push(EOF)');
+ //console.error(' push(EOF)');
return r.push(null);
}
- console.error(' push #%d', pushes);
+ //console.error(' push #%d', pushes);
if (r.push(new Buffer(PUSHSIZE)))
setTimeout(push);
}
diff --git a/test/simple/test-stream2-objects.js b/test/simple/test-stream2-objects.js
index 3e6931d..ff47d89 100644
--- a/test/simple/test-stream2-objects.js
+++ b/test/simple/test-stream2-objects.js
@@ -21,8 +21,8 @@
var common = require('../common.js');
-var Readable = require('_stream_readable');
-var Writable = require('_stream_writable');
+var Readable = require('../../lib/_stream_readable');
+var Writable = require('../../lib/_stream_writable');
var assert = require('assert');
// tiny node-tap lookalike.
diff --git a/test/simple/test-stream2-pipe-error-handling.js b/test/simple/test-stream2-pipe-error-handling.js
index cf7531c..e3f3e4e 100644
--- a/test/simple/test-stream2-pipe-error-handling.js
+++ b/test/simple/test-stream2-pipe-error-handling.js
@@ -21,7 +21,7 @@
var common = require('../common');
var assert = require('assert');
-var stream = require('stream');
+var stream = require('../../');
(function testErrorListenerCatches() {
var count = 1000;
diff --git a/test/simple/test-stream2-pipe-error-once-listener.js b/test/simple/test-stream2-pipe-error-once-listener.js
index 5e8e3cb..53b2616 100755
--- a/test/simple/test-stream2-pipe-error-once-listener.js
+++ b/test/simple/test-stream2-pipe-error-once-listener.js
@@ -24,7 +24,7 @@ var common = require('../common.js');
var assert = require('assert');
var util = require('util');
-var stream = require('stream');
+var stream = require('../../');
var Read = function() {
diff --git a/test/simple/test-stream2-push.js b/test/simple/test-stream2-push.js
index b63edc3..eb2b0e9 100644
--- a/test/simple/test-stream2-push.js
+++ b/test/simple/test-stream2-push.js
@@ -20,7 +20,7 @@
// USE OR OTHER DEALINGS IN THE SOFTWARE.
var common = require('../common.js');
-var stream = require('stream');
+var stream = require('../../');
var Readable = stream.Readable;
var Writable = stream.Writable;
var assert = require('assert');
diff --git a/test/simple/test-stream2-read-sync-stack.js b/test/simple/test-stream2-read-sync-stack.js
index e8a7305..9740a47 100644
--- a/test/simple/test-stream2-read-sync-stack.js
+++ b/test/simple/test-stream2-read-sync-stack.js
@@ -21,7 +21,7 @@
var common = require('../common');
var assert = require('assert');
-var Readable = require('stream').Readable;
+var Readable = require('../../').Readable;
var r = new Readable();
var N = 256 * 1024;
diff --git a/test/simple/test-stream2-readable-empty-buffer-no-eof.js b/test/simple/test-stream2-readable-empty-buffer-no-eof.js
index cd30178..4b1659d 100644
--- a/test/simple/test-stream2-readable-empty-buffer-no-eof.js
+++ b/test/simple/test-stream2-readable-empty-buffer-no-eof.js
@@ -22,10 +22,9 @@
var common = require('../common');
var assert = require('assert');
-var Readable = require('stream').Readable;
+var Readable = require('../../').Readable;
test1();
-test2();
function test1() {
var r = new Readable();
@@ -88,31 +87,3 @@ function test1() {
console.log('ok');
});
}
-
-function test2() {
- var r = new Readable({ encoding: 'base64' });
- var reads = 5;
- r._read = function(n) {
- if (!reads--)
- return r.push(null); // EOF
- else
- return r.push(new Buffer('x'));
- };
-
- var results = [];
- function flow() {
- var chunk;
- while (null !== (chunk = r.read()))
- results.push(chunk + '');
- }
- r.on('readable', flow);
- r.on('end', function() {
- results.push('EOF');
- });
- flow();
-
- process.on('exit', function() {
- assert.deepEqual(results, [ 'eHh4', 'eHg=', 'EOF' ]);
- console.log('ok');
- });
-}
diff --git a/test/simple/test-stream2-readable-from-list.js b/test/simple/test-stream2-readable-from-list.js
index 7c96ffe..04a96f5 100644
--- a/test/simple/test-stream2-readable-from-list.js
+++ b/test/simple/test-stream2-readable-from-list.js
@@ -21,7 +21,7 @@
var assert = require('assert');
var common = require('../common.js');
-var fromList = require('_stream_readable')._fromList;
+var fromList = require('../../lib/_stream_readable')._fromList;
// tiny node-tap lookalike.
var tests = [];
diff --git a/test/simple/test-stream2-readable-legacy-drain.js b/test/simple/test-stream2-readable-legacy-drain.js
index 675da8e..51fd3d5 100644
--- a/test/simple/test-stream2-readable-legacy-drain.js
+++ b/test/simple/test-stream2-readable-legacy-drain.js
@@ -22,7 +22,7 @@
var common = require('../common');
var assert = require('assert');
-var Stream = require('stream');
+var Stream = require('../../');
var Readable = Stream.Readable;
var r = new Readable();
diff --git a/test/simple/test-stream2-readable-non-empty-end.js b/test/simple/test-stream2-readable-non-empty-end.js
index 7314ae7..c971898 100644
--- a/test/simple/test-stream2-readable-non-empty-end.js
+++ b/test/simple/test-stream2-readable-non-empty-end.js
@@ -21,7 +21,7 @@
var assert = require('assert');
var common = require('../common.js');
-var Readable = require('_stream_readable');
+var Readable = require('../../lib/_stream_readable');
var len = 0;
var chunks = new Array(10);
diff --git a/test/simple/test-stream2-readable-wrap-empty.js b/test/simple/test-stream2-readable-wrap-empty.js
index 2e5cf25..fd8a3dc 100644
--- a/test/simple/test-stream2-readable-wrap-empty.js
+++ b/test/simple/test-stream2-readable-wrap-empty.js
@@ -22,7 +22,7 @@
var common = require('../common');
var assert = require('assert');
-var Readable = require('_stream_readable');
+var Readable = require('../../lib/_stream_readable');
var EE = require('events').EventEmitter;
var oldStream = new EE();
diff --git a/test/simple/test-stream2-readable-wrap.js b/test/simple/test-stream2-readable-wrap.js
index 90eea01..6b177f7 100644
--- a/test/simple/test-stream2-readable-wrap.js
+++ b/test/simple/test-stream2-readable-wrap.js
@@ -22,8 +22,8 @@
var common = require('../common');
var assert = require('assert');
-var Readable = require('_stream_readable');
-var Writable = require('_stream_writable');
+var Readable = require('../../lib/_stream_readable');
+var Writable = require('../../lib/_stream_writable');
var EE = require('events').EventEmitter;
var testRuns = 0, completedRuns = 0;
diff --git a/test/simple/test-stream2-set-encoding.js b/test/simple/test-stream2-set-encoding.js
index 5d2c32a..685531b 100644
--- a/test/simple/test-stream2-set-encoding.js
+++ b/test/simple/test-stream2-set-encoding.js
@@ -22,7 +22,7 @@
var common = require('../common.js');
var assert = require('assert');
-var R = require('_stream_readable');
+var R = require('../../lib/_stream_readable');
var util = require('util');
// tiny node-tap lookalike.
diff --git a/test/simple/test-stream2-transform.js b/test/simple/test-stream2-transform.js
index 9c9ddd8..a0cacc6 100644
--- a/test/simple/test-stream2-transform.js
+++ b/test/simple/test-stream2-transform.js
@@ -21,8 +21,8 @@
var assert = require('assert');
var common = require('../common.js');
-var PassThrough = require('_stream_passthrough');
-var Transform = require('_stream_transform');
+var PassThrough = require('../../').PassThrough;
+var Transform = require('../../').Transform;
// tiny node-tap lookalike.
var tests = [];
diff --git a/test/simple/test-stream2-unpipe-drain.js b/test/simple/test-stream2-unpipe-drain.js
index d66dc3c..365b327 100644
--- a/test/simple/test-stream2-unpipe-drain.js
+++ b/test/simple/test-stream2-unpipe-drain.js
@@ -22,7 +22,7 @@
var common = require('../common.js');
var assert = require('assert');
-var stream = require('stream');
+var stream = require('../../');
var crypto = require('crypto');
var util = require('util');
diff --git a/test/simple/test-stream2-unpipe-leak.js b/test/simple/test-stream2-unpipe-leak.js
index 99f8746..17c92ae 100644
--- a/test/simple/test-stream2-unpipe-leak.js
+++ b/test/simple/test-stream2-unpipe-leak.js
@@ -22,7 +22,7 @@
var common = require('../common.js');
var assert = require('assert');
-var stream = require('stream');
+var stream = require('../../');
var chunk = new Buffer('hallo');
diff --git a/test/simple/test-stream2-writable.js b/test/simple/test-stream2-writable.js
index 704100c..209c3a6 100644
--- a/test/simple/test-stream2-writable.js
+++ b/test/simple/test-stream2-writable.js
@@ -20,8 +20,8 @@
// USE OR OTHER DEALINGS IN THE SOFTWARE.
var common = require('../common.js');
-var W = require('_stream_writable');
-var D = require('_stream_duplex');
+var W = require('../../').Writable;
+var D = require('../../').Duplex;
var assert = require('assert');
var util = require('util');
diff --git a/test/simple/test-stream3-pause-then-read.js b/test/simple/test-stream3-pause-then-read.js
index b91bde3..2f72c15 100644
--- a/test/simple/test-stream3-pause-then-read.js
+++ b/test/simple/test-stream3-pause-then-read.js
@@ -22,7 +22,7 @@
var common = require('../common');
var assert = require('assert');
-var stream = require('stream');
+var stream = require('../../');
var Readable = stream.Readable;
var Writable = stream.Writable;

View File

@@ -0,0 +1,89 @@
// Copyright Joyent, Inc. and other Node contributors.
//
// Permission is hereby granted, free of charge, to any person obtaining a
// copy of this software and associated documentation files (the
// "Software"), to deal in the Software without restriction, including
// without limitation the rights to use, copy, modify, merge, publish,
// distribute, sublicense, and/or sell copies of the Software, and to permit
// persons to whom the Software is furnished to do so, subject to the
// following conditions:
//
// The above copyright notice and this permission notice shall be included
// in all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
// USE OR OTHER DEALINGS IN THE SOFTWARE.
// a duplex stream is just a stream that is both readable and writable.
// Since JS doesn't have multiple prototypal inheritance, this class
// prototypally inherits from Readable, and then parasitically from
// Writable.
module.exports = Duplex;
/*<replacement>*/
var objectKeys = Object.keys || function (obj) {
var keys = [];
for (var key in obj) keys.push(key);
return keys;
}
/*</replacement>*/
/*<replacement>*/
var util = require('core-util-is');
util.inherits = require('inherits');
/*</replacement>*/
var Readable = require('./_stream_readable');
var Writable = require('./_stream_writable');
util.inherits(Duplex, Readable);
forEach(objectKeys(Writable.prototype), function(method) {
if (!Duplex.prototype[method])
Duplex.prototype[method] = Writable.prototype[method];
});
function Duplex(options) {
if (!(this instanceof Duplex))
return new Duplex(options);
Readable.call(this, options);
Writable.call(this, options);
if (options && options.readable === false)
this.readable = false;
if (options && options.writable === false)
this.writable = false;
this.allowHalfOpen = true;
if (options && options.allowHalfOpen === false)
this.allowHalfOpen = false;
this.once('end', onend);
}
// the no-half-open enforcer
function onend() {
// if we allow half-open state, or if the writable side ended,
// then we're ok.
if (this.allowHalfOpen || this._writableState.ended)
return;
// no more data can be written.
// But allow more writes to happen in this tick.
process.nextTick(this.end.bind(this));
}
function forEach (xs, f) {
for (var i = 0, l = xs.length; i < l; i++) {
f(xs[i], i);
}
}
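// Usage sketch (illustrative, not part of the vendored source above):
// a Duplex consumer normally supplies both _read and _write. The sketch
// uses Node's public stream module, which exposes the same Duplex class;
// the names EchoDuplex and _queue are hypothetical.
var DuplexStream = require('stream').Duplex;
var nodeUtil = require('util');

function EchoDuplex(options) {
  if (!(this instanceof EchoDuplex)) return new EchoDuplex(options);
  DuplexStream.call(this, options);
  this._queue = [];                 // chunks written but not yet read
}
nodeUtil.inherits(EchoDuplex, DuplexStream);

// writable side: stash incoming chunks
EchoDuplex.prototype._write = function(chunk, encoding, cb) {
  this._queue.push(chunk);
  cb();
};

// readable side: hand back whatever has been written so far;
// a fuller implementation would also push(null) once the writable side ends
EchoDuplex.prototype._read = function(n) {
  if (this._queue.length) this.push(this._queue.shift());
};

var echo = new EchoDuplex();
echo.on('data', function(chunk) { console.log('out:', chunk.toString()); });
echo.write('hello');
echo.end();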

View File

@@ -0,0 +1,46 @@
// Copyright Joyent, Inc. and other Node contributors.
//
// Permission is hereby granted, free of charge, to any person obtaining a
// copy of this software and associated documentation files (the
// "Software"), to deal in the Software without restriction, including
// without limitation the rights to use, copy, modify, merge, publish,
// distribute, sublicense, and/or sell copies of the Software, and to permit
// persons to whom the Software is furnished to do so, subject to the
// following conditions:
//
// The above copyright notice and this permission notice shall be included
// in all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
// USE OR OTHER DEALINGS IN THE SOFTWARE.
// a passthrough stream.
// basically just the most minimal sort of Transform stream.
// Every written chunk gets output as-is.
module.exports = PassThrough;
var Transform = require('./_stream_transform');
/*<replacement>*/
var util = require('core-util-is');
util.inherits = require('inherits');
/*</replacement>*/
util.inherits(PassThrough, Transform);
function PassThrough(options) {
if (!(this instanceof PassThrough))
return new PassThrough(options);
Transform.call(this, options);
}
PassThrough.prototype._transform = function(chunk, encoding, cb) {
cb(null, chunk);
};
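// Usage sketch (illustrative, not part of the vendored source above):
// PassThrough is an identity stream, handy for turning plain writes into
// something pipeable. Built on Node's public stream module, which exposes
// the same class implemented here.
var PT = require('stream').PassThrough;
var pt = new PT();
pt.on('data', function(chunk) {
  console.log('passed through:', chunk.toString());
});
pt.write('abc');   // comes out unchanged
pt.end('def');     // final chunk, then 'end' is emitted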

View File

@@ -0,0 +1,951 @@
// Copyright Joyent, Inc. and other Node contributors.
//
// Permission is hereby granted, free of charge, to any person obtaining a
// copy of this software and associated documentation files (the
// "Software"), to deal in the Software without restriction, including
// without limitation the rights to use, copy, modify, merge, publish,
// distribute, sublicense, and/or sell copies of the Software, and to permit
// persons to whom the Software is furnished to do so, subject to the
// following conditions:
//
// The above copyright notice and this permission notice shall be included
// in all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
// USE OR OTHER DEALINGS IN THE SOFTWARE.
module.exports = Readable;
/*<replacement>*/
var isArray = require('isarray');
/*</replacement>*/
/*<replacement>*/
var Buffer = require('buffer').Buffer;
/*</replacement>*/
Readable.ReadableState = ReadableState;
var EE = require('events').EventEmitter;
/*<replacement>*/
if (!EE.listenerCount) EE.listenerCount = function(emitter, type) {
return emitter.listeners(type).length;
};
/*</replacement>*/
var Stream = require('stream');
/*<replacement>*/
var util = require('core-util-is');
util.inherits = require('inherits');
/*</replacement>*/
var StringDecoder;
/*<replacement>*/
var debug = require('util');
if (debug && debug.debuglog) {
debug = debug.debuglog('stream');
} else {
debug = function () {};
}
/*</replacement>*/
util.inherits(Readable, Stream);
function ReadableState(options, stream) {
var Duplex = require('./_stream_duplex');
options = options || {};
// the point at which it stops calling _read() to fill the buffer
// Note: 0 is a valid value, means "don't call _read preemptively ever"
var hwm = options.highWaterMark;
var defaultHwm = options.objectMode ? 16 : 16 * 1024;
this.highWaterMark = (hwm || hwm === 0) ? hwm : defaultHwm;
// cast to ints.
this.highWaterMark = ~~this.highWaterMark;
this.buffer = [];
this.length = 0;
this.pipes = null;
this.pipesCount = 0;
this.flowing = null;
this.ended = false;
this.endEmitted = false;
this.reading = false;
// a flag to be able to tell if the onwrite cb is called immediately,
// or on a later tick. We set this to true at first, because any
// actions that shouldn't happen until "later" should generally also
// not happen before the first write call.
this.sync = true;
// whenever we return null, then we set a flag to say
// that we're awaiting a 'readable' event emission.
this.needReadable = false;
this.emittedReadable = false;
this.readableListening = false;
// object stream flag. Used to make read(n) ignore n and to
// make all the buffer merging and length checks go away
this.objectMode = !!options.objectMode;
if (stream instanceof Duplex)
this.objectMode = this.objectMode || !!options.readableObjectMode;
// Crypto is kind of old and crusty. Historically, its default string
// encoding is 'binary' so we have to make this configurable.
// Everything else in the universe uses 'utf8', though.
this.defaultEncoding = options.defaultEncoding || 'utf8';
// when piping, we only care about 'readable' events that happen
// after read()ing all the bytes and not getting any pushback.
this.ranOut = false;
// the number of writers that are awaiting a drain event in .pipe()s
this.awaitDrain = 0;
// if true, a maybeReadMore has been scheduled
this.readingMore = false;
this.decoder = null;
this.encoding = null;
if (options.encoding) {
if (!StringDecoder)
StringDecoder = require('string_decoder/').StringDecoder;
this.decoder = new StringDecoder(options.encoding);
this.encoding = options.encoding;
}
}
function Readable(options) {
var Duplex = require('./_stream_duplex');
if (!(this instanceof Readable))
return new Readable(options);
this._readableState = new ReadableState(options, this);
// legacy
this.readable = true;
Stream.call(this);
}
// Manually shove something into the read() buffer.
// This returns true if the highWaterMark has not been hit yet,
// similar to how Writable.write() returns true if you should
// write() some more.
Readable.prototype.push = function(chunk, encoding) {
var state = this._readableState;
if (util.isString(chunk) && !state.objectMode) {
encoding = encoding || state.defaultEncoding;
if (encoding !== state.encoding) {
chunk = new Buffer(chunk, encoding);
encoding = '';
}
}
return readableAddChunk(this, state, chunk, encoding, false);
};
// Unshift should *always* be something directly out of read()
Readable.prototype.unshift = function(chunk) {
var state = this._readableState;
return readableAddChunk(this, state, chunk, '', true);
};
function readableAddChunk(stream, state, chunk, encoding, addToFront) {
var er = chunkInvalid(state, chunk);
if (er) {
stream.emit('error', er);
} else if (util.isNullOrUndefined(chunk)) {
state.reading = false;
if (!state.ended)
onEofChunk(stream, state);
} else if (state.objectMode || chunk && chunk.length > 0) {
if (state.ended && !addToFront) {
var e = new Error('stream.push() after EOF');
stream.emit('error', e);
} else if (state.endEmitted && addToFront) {
var e = new Error('stream.unshift() after end event');
stream.emit('error', e);
} else {
if (state.decoder && !addToFront && !encoding)
chunk = state.decoder.write(chunk);
if (!addToFront)
state.reading = false;
// if we want the data now, just emit it.
if (state.flowing && state.length === 0 && !state.sync) {
stream.emit('data', chunk);
stream.read(0);
} else {
// update the buffer info.
state.length += state.objectMode ? 1 : chunk.length;
if (addToFront)
state.buffer.unshift(chunk);
else
state.buffer.push(chunk);
if (state.needReadable)
emitReadable(stream);
}
maybeReadMore(stream, state);
}
} else if (!addToFront) {
state.reading = false;
}
return needMoreData(state);
}
// if it's past the high water mark, we can push in some more.
// Also, if we have no data yet, we can stand some
// more bytes. This is to work around cases where hwm=0,
// such as the repl. Also, if the push() triggered a
// readable event, and the user called read(largeNumber) such that
// needReadable was set, then we ought to push more, so that another
// 'readable' event will be triggered.
function needMoreData(state) {
return !state.ended &&
(state.needReadable ||
state.length < state.highWaterMark ||
state.length === 0);
}
// backwards compatibility.
Readable.prototype.setEncoding = function(enc) {
if (!StringDecoder)
StringDecoder = require('string_decoder/').StringDecoder;
this._readableState.decoder = new StringDecoder(enc);
this._readableState.encoding = enc;
return this;
};
// Don't raise the hwm > 128MB
var MAX_HWM = 0x800000;
function roundUpToNextPowerOf2(n) {
if (n >= MAX_HWM) {
n = MAX_HWM;
} else {
// Get the next highest power of 2
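// e.g. n = 1000: n-- gives 999, the shifts set every bit below the top bit (999 -> 1023), and n++ yields 1024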
n--;
for (var p = 1; p < 32; p <<= 1) n |= n >> p;
n++;
}
return n;
}
function howMuchToRead(n, state) {
if (state.length === 0 && state.ended)
return 0;
if (state.objectMode)
return n === 0 ? 0 : 1;
if (isNaN(n) || util.isNull(n)) {
// only flow one buffer at a time
if (state.flowing && state.buffer.length)
return state.buffer[0].length;
else
return state.length;
}
if (n <= 0)
return 0;
// If we're asking for more than the target buffer level,
// then raise the water mark. Bump up to the next highest
// power of 2, to prevent increasing it excessively in tiny
// amounts.
if (n > state.highWaterMark)
state.highWaterMark = roundUpToNextPowerOf2(n);
// don't have that much. return null, unless we've ended.
if (n > state.length) {
if (!state.ended) {
state.needReadable = true;
return 0;
} else
return state.length;
}
return n;
}
// you can override either this method, or the async _read(n) below.
Readable.prototype.read = function(n) {
debug('read', n);
var state = this._readableState;
var nOrig = n;
if (!util.isNumber(n) || n > 0)
state.emittedReadable = false;
// if we're doing read(0) to trigger a readable event, but we
// already have a bunch of data in the buffer, then just trigger
// the 'readable' event and move on.
if (n === 0 &&
state.needReadable &&
(state.length >= state.highWaterMark || state.ended)) {
debug('read: emitReadable', state.length, state.ended);
if (state.length === 0 && state.ended)
endReadable(this);
else
emitReadable(this);
return null;
}
n = howMuchToRead(n, state);
// if we've ended, and we're now clear, then finish it up.
if (n === 0 && state.ended) {
if (state.length === 0)
endReadable(this);
return null;
}
// All the actual chunk generation logic needs to be
// *below* the call to _read. The reason is that in certain
// synthetic stream cases, such as passthrough streams, _read
// may be a completely synchronous operation which may change
// the state of the read buffer, providing enough data when
// before there was *not* enough.
//
// So, the steps are:
// 1. Figure out what the state of things will be after we do
// a read from the buffer.
//
// 2. If that resulting state will trigger a _read, then call _read.
// Note that this may be asynchronous, or synchronous. Yes, it is
// deeply ugly to write APIs this way, but that still doesn't mean
// that the Readable class should behave improperly, as streams are
// designed to be sync/async agnostic.
// Take note if the _read call is sync or async (ie, if the read call
// has returned yet), so that we know whether or not it's safe to emit
// 'readable' etc.
//
// 3. Actually pull the requested chunks out of the buffer and return.
// if we need a readable event, then we need to do some reading.
var doRead = state.needReadable;
debug('need readable', doRead);
// if we currently have less than the highWaterMark, then also read some
if (state.length === 0 || state.length - n < state.highWaterMark) {
doRead = true;
debug('length less than watermark', doRead);
}
// however, if we've ended, then there's no point, and if we're already
// reading, then it's unnecessary.
if (state.ended || state.reading) {
doRead = false;
debug('reading or ended', doRead);
}
if (doRead) {
debug('do read');
state.reading = true;
state.sync = true;
// if the length is currently zero, then we *need* a readable event.
if (state.length === 0)
state.needReadable = true;
// call internal read method
this._read(state.highWaterMark);
state.sync = false;
}
// If _read pushed data synchronously, then `reading` will be false,
// and we need to re-evaluate how much data we can return to the user.
if (doRead && !state.reading)
n = howMuchToRead(nOrig, state);
var ret;
if (n > 0)
ret = fromList(n, state);
else
ret = null;
if (util.isNull(ret)) {
state.needReadable = true;
n = 0;
}
state.length -= n;
// If we have nothing in the buffer, then we want to know
// as soon as we *do* get something into the buffer.
if (state.length === 0 && !state.ended)
state.needReadable = true;
// If we tried to read() past the EOF, then emit end on the next tick.
if (nOrig !== n && state.ended && state.length === 0)
endReadable(this);
if (!util.isNull(ret))
this.emit('data', ret);
return ret;
};
function chunkInvalid(state, chunk) {
var er = null;
if (!util.isBuffer(chunk) &&
!util.isString(chunk) &&
!util.isNullOrUndefined(chunk) &&
!state.objectMode) {
er = new TypeError('Invalid non-string/buffer chunk');
}
return er;
}
function onEofChunk(stream, state) {
if (state.decoder && !state.ended) {
var chunk = state.decoder.end();
if (chunk && chunk.length) {
state.buffer.push(chunk);
state.length += state.objectMode ? 1 : chunk.length;
}
}
state.ended = true;
// emit 'readable' now to make sure it gets picked up.
emitReadable(stream);
}
// Don't emit readable right away in sync mode, because this can trigger
// another read() call => stack overflow. This way, it might trigger
// a nextTick recursion warning, but that's not so bad.
function emitReadable(stream) {
var state = stream._readableState;
state.needReadable = false;
if (!state.emittedReadable) {
debug('emitReadable', state.flowing);
state.emittedReadable = true;
if (state.sync)
process.nextTick(function() {
emitReadable_(stream);
});
else
emitReadable_(stream);
}
}
function emitReadable_(stream) {
debug('emit readable');
stream.emit('readable');
flow(stream);
}
// at this point, the user has presumably seen the 'readable' event,
// and called read() to consume some data. that may have triggered
// in turn another _read(n) call, in which case reading = true if
// it's in progress.
// However, if we're not ended, or reading, and the length < hwm,
// then go ahead and try to read some more preemptively.
function maybeReadMore(stream, state) {
if (!state.readingMore) {
state.readingMore = true;
process.nextTick(function() {
maybeReadMore_(stream, state);
});
}
}
function maybeReadMore_(stream, state) {
var len = state.length;
while (!state.reading && !state.flowing && !state.ended &&
state.length < state.highWaterMark) {
debug('maybeReadMore read 0');
stream.read(0);
if (len === state.length)
// didn't get any data, stop spinning.
break;
else
len = state.length;
}
state.readingMore = false;
}
// abstract method. to be overridden in specific implementation classes.
// call cb(er, data) where data is <= n in length.
// for virtual (non-string, non-buffer) streams, "length" is somewhat
// arbitrary, and perhaps not very meaningful.
Readable.prototype._read = function(n) {
this.emit('error', new Error('not implemented'));
};
Readable.prototype.pipe = function(dest, pipeOpts) {
var src = this;
var state = this._readableState;
switch (state.pipesCount) {
case 0:
state.pipes = dest;
break;
case 1:
state.pipes = [state.pipes, dest];
break;
default:
state.pipes.push(dest);
break;
}
state.pipesCount += 1;
debug('pipe count=%d opts=%j', state.pipesCount, pipeOpts);
var doEnd = (!pipeOpts || pipeOpts.end !== false) &&
dest !== process.stdout &&
dest !== process.stderr;
var endFn = doEnd ? onend : cleanup;
if (state.endEmitted)
process.nextTick(endFn);
else
src.once('end', endFn);
dest.on('unpipe', onunpipe);
function onunpipe(readable) {
debug('onunpipe');
if (readable === src) {
cleanup();
}
}
function onend() {
debug('onend');
dest.end();
}
// when the dest drains, it reduces the awaitDrain counter
// on the source. This would be more elegant with a .once()
// handler in flow(), but adding and removing repeatedly is
// too slow.
var ondrain = pipeOnDrain(src);
dest.on('drain', ondrain);
function cleanup() {
debug('cleanup');
// cleanup event handlers once the pipe is broken
dest.removeListener('close', onclose);
dest.removeListener('finish', onfinish);
dest.removeListener('drain', ondrain);
dest.removeListener('error', onerror);
dest.removeListener('unpipe', onunpipe);
src.removeListener('end', onend);
src.removeListener('end', cleanup);
src.removeListener('data', ondata);
// if the reader is waiting for a drain event from this
// specific writer, then it would cause it to never start
// flowing again.
// So, if this is awaiting a drain, then we just call it now.
// If we don't know, then assume that we are waiting for one.
if (state.awaitDrain &&
(!dest._writableState || dest._writableState.needDrain))
ondrain();
}
src.on('data', ondata);
function ondata(chunk) {
debug('ondata');
var ret = dest.write(chunk);
if (false === ret) {
debug('false write response, pause',
src._readableState.awaitDrain);
src._readableState.awaitDrain++;
src.pause();
}
}
// if the dest has an error, then stop piping into it.
// however, don't suppress the throwing behavior for this.
function onerror(er) {
debug('onerror', er);
unpipe();
dest.removeListener('error', onerror);
if (EE.listenerCount(dest, 'error') === 0)
dest.emit('error', er);
}
// This is a brutally ugly hack to make sure that our error handler
// is attached before any userland ones. NEVER DO THIS.
if (!dest._events || !dest._events.error)
dest.on('error', onerror);
else if (isArray(dest._events.error))
dest._events.error.unshift(onerror);
else
dest._events.error = [onerror, dest._events.error];
// Both close and finish should trigger unpipe, but only once.
function onclose() {
dest.removeListener('finish', onfinish);
unpipe();
}
dest.once('close', onclose);
function onfinish() {
debug('onfinish');
dest.removeListener('close', onclose);
unpipe();
}
dest.once('finish', onfinish);
function unpipe() {
debug('unpipe');
src.unpipe(dest);
}
// tell the dest that it's being piped to
dest.emit('pipe', src);
// start the flow if it hasn't been started already.
if (!state.flowing) {
debug('pipe resume');
src.resume();
}
return dest;
};
function pipeOnDrain(src) {
return function() {
var state = src._readableState;
debug('pipeOnDrain', state.awaitDrain);
if (state.awaitDrain)
state.awaitDrain--;
if (state.awaitDrain === 0 && EE.listenerCount(src, 'data')) {
state.flowing = true;
flow(src);
}
};
}
Readable.prototype.unpipe = function(dest) {
var state = this._readableState;
// if we're not piping anywhere, then do nothing.
if (state.pipesCount === 0)
return this;
// just one destination. most common case.
if (state.pipesCount === 1) {
// passed in one, but it's not the right one.
if (dest && dest !== state.pipes)
return this;
if (!dest)
dest = state.pipes;
// got a match.
state.pipes = null;
state.pipesCount = 0;
state.flowing = false;
if (dest)
dest.emit('unpipe', this);
return this;
}
// slow case. multiple pipe destinations.
if (!dest) {
// remove all.
var dests = state.pipes;
var len = state.pipesCount;
state.pipes = null;
state.pipesCount = 0;
state.flowing = false;
for (var i = 0; i < len; i++)
dests[i].emit('unpipe', this);
return this;
}
// try to find the right one.
var i = indexOf(state.pipes, dest);
if (i === -1)
return this;
state.pipes.splice(i, 1);
state.pipesCount -= 1;
if (state.pipesCount === 1)
state.pipes = state.pipes[0];
dest.emit('unpipe', this);
return this;
};
// set up data events if they are asked for
// Ensure readable listeners eventually get something
Readable.prototype.on = function(ev, fn) {
var res = Stream.prototype.on.call(this, ev, fn);
// If listening to data, and it has not explicitly been paused,
// then call resume to start the flow of data on the next tick.
if (ev === 'data' && false !== this._readableState.flowing) {
this.resume();
}
if (ev === 'readable' && this.readable) {
var state = this._readableState;
if (!state.readableListening) {
state.readableListening = true;
state.emittedReadable = false;
state.needReadable = true;
if (!state.reading) {
var self = this;
process.nextTick(function() {
debug('readable nexttick read 0');
self.read(0);
});
} else if (state.length) {
emitReadable(this, state);
}
}
}
return res;
};
Readable.prototype.addListener = Readable.prototype.on;
// pause() and resume() are remnants of the legacy readable stream API
// If the user uses them, then switch into old mode.
Readable.prototype.resume = function() {
var state = this._readableState;
if (!state.flowing) {
debug('resume');
state.flowing = true;
if (!state.reading) {
debug('resume read 0');
this.read(0);
}
resume(this, state);
}
return this;
};
function resume(stream, state) {
if (!state.resumeScheduled) {
state.resumeScheduled = true;
process.nextTick(function() {
resume_(stream, state);
});
}
}
function resume_(stream, state) {
state.resumeScheduled = false;
stream.emit('resume');
flow(stream);
if (state.flowing && !state.reading)
stream.read(0);
}
Readable.prototype.pause = function() {
debug('call pause flowing=%j', this._readableState.flowing);
if (false !== this._readableState.flowing) {
debug('pause');
this._readableState.flowing = false;
this.emit('pause');
}
return this;
};
function flow(stream) {
var state = stream._readableState;
debug('flow', state.flowing);
if (state.flowing) {
do {
var chunk = stream.read();
} while (null !== chunk && state.flowing);
}
}
// wrap an old-style stream as the async data source.
// This is *not* part of the readable stream interface.
// It is an ugly unfortunate mess of history.
Readable.prototype.wrap = function(stream) {
var state = this._readableState;
var paused = false;
var self = this;
stream.on('end', function() {
debug('wrapped end');
if (state.decoder && !state.ended) {
var chunk = state.decoder.end();
if (chunk && chunk.length)
self.push(chunk);
}
self.push(null);
});
stream.on('data', function(chunk) {
debug('wrapped data');
if (state.decoder)
chunk = state.decoder.write(chunk);
if (!chunk || !state.objectMode && !chunk.length)
return;
var ret = self.push(chunk);
if (!ret) {
paused = true;
stream.pause();
}
});
// proxy all the other methods.
// important when wrapping filters and duplexes.
for (var i in stream) {
if (util.isFunction(stream[i]) && util.isUndefined(this[i])) {
this[i] = function(method) { return function() {
return stream[method].apply(stream, arguments);
}}(i);
}
}
// proxy certain important events.
var events = ['error', 'close', 'destroy', 'pause', 'resume'];
forEach(events, function(ev) {
stream.on(ev, self.emit.bind(self, ev));
});
// when we try to consume some more bytes, simply unpause the
// underlying stream.
self._read = function(n) {
debug('wrapped _read', n);
if (paused) {
paused = false;
stream.resume();
}
};
return self;
};
// exposed for testing purposes only.
Readable._fromList = fromList;
// Pluck off n bytes from an array of buffers.
// Length is the combined lengths of all the buffers in the list.
function fromList(n, state) {
var list = state.buffer;
var length = state.length;
var stringMode = !!state.decoder;
var objectMode = !!state.objectMode;
var ret;
// nothing in the list, definitely empty.
if (list.length === 0)
return null;
if (length === 0)
ret = null;
else if (objectMode)
ret = list.shift();
else if (!n || n >= length) {
// read it all, truncate the array.
if (stringMode)
ret = list.join('');
else
ret = Buffer.concat(list, length);
list.length = 0;
} else {
// read just some of it.
if (n < list[0].length) {
// just take a part of the first list item.
// slice is the same for buffers and strings.
var buf = list[0];
ret = buf.slice(0, n);
list[0] = buf.slice(n);
} else if (n === list[0].length) {
// first list is a perfect match
ret = list.shift();
} else {
// complex case.
// we have enough to cover it, but it spans past the first buffer.
if (stringMode)
ret = '';
else
ret = new Buffer(n);
var c = 0;
for (var i = 0, l = list.length; i < l && c < n; i++) {
var buf = list[0];
var cpy = Math.min(n - c, buf.length);
if (stringMode)
ret += buf.slice(0, cpy);
else
buf.copy(ret, c, 0, cpy);
if (cpy < buf.length)
list[0] = buf.slice(cpy);
else
list.shift();
c += cpy;
}
}
}
return ret;
}
function endReadable(stream) {
var state = stream._readableState;
// If we get here before consuming all the bytes, then that is a
// bug in node. Should never happen.
if (state.length > 0)
throw new Error('endReadable called on non-empty stream');
if (!state.endEmitted) {
state.ended = true;
process.nextTick(function() {
// Check that we didn't get one last unshift.
if (!state.endEmitted && state.length === 0) {
state.endEmitted = true;
stream.readable = false;
stream.emit('end');
}
});
}
}
function forEach (xs, f) {
for (var i = 0, l = xs.length; i < l; i++) {
f(xs[i], i);
}
}
function indexOf (xs, x) {
for (var i = 0, l = xs.length; i < l; i++) {
if (xs[i] === x) return i;
}
return -1;
}
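// Usage sketch (illustrative, not part of the vendored source above):
// a minimal Readable producer using Node's public stream module, which
// exposes the Readable class implemented here. push() feeds the internal
// buffer; push(null) signals EOF and eventually emits 'end'. The names
// counter and remaining are hypothetical.
var PublicReadable = require('stream').Readable;
var counter = new PublicReadable();
var remaining = 3;
counter._read = function(n) {
  if (remaining > 0) this.push(String(remaining--));
  else this.push(null);            // EOF
};
counter.on('data', function(chunk) {
  console.log('chunk:', chunk.toString());
});
counter.on('end', function() {
  console.log('done');
});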

View File

@@ -0,0 +1,209 @@
// Copyright Joyent, Inc. and other Node contributors.
//
// Permission is hereby granted, free of charge, to any person obtaining a
// copy of this software and associated documentation files (the
// "Software"), to deal in the Software without restriction, including
// without limitation the rights to use, copy, modify, merge, publish,
// distribute, sublicense, and/or sell copies of the Software, and to permit
// persons to whom the Software is furnished to do so, subject to the
// following conditions:
//
// The above copyright notice and this permission notice shall be included
// in all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
// USE OR OTHER DEALINGS IN THE SOFTWARE.
// a transform stream is a readable/writable stream where you do
// something with the data. Sometimes it's called a "filter",
// but that's not a great name for it, since that implies a thing where
// some bits pass through, and others are simply ignored. (That would
// be a valid example of a transform, of course.)
//
// While the output is causally related to the input, it's not a
// necessarily symmetric or synchronous transformation. For example,
// a zlib stream might take multiple plain-text writes(), and then
// emit a single compressed chunk some time in the future.
//
// Here's how this works:
//
// The Transform stream has all the aspects of the readable and writable
// stream classes. When you write(chunk), that calls _write(chunk,cb)
// internally, and returns false if there's a lot of pending writes
// buffered up. When you call read(), that calls _read(n) until
// there's enough pending readable data buffered up.
//
// In a transform stream, the written data is placed in a buffer. When
// _read(n) is called, it transforms the queued up data, calling the
// buffered _write cb's as it consumes chunks. If consuming a single
// written chunk would result in multiple output chunks, then the first
// outputted bit calls the readcb, and subsequent chunks just go into
// the read buffer, and will cause it to emit 'readable' if necessary.
//
// This way, back-pressure is actually determined by the reading side,
// since _read has to be called to start processing a new chunk. However,
// a pathological inflate type of transform can cause excessive buffering
// here. For example, imagine a stream where every byte of input is
// interpreted as an integer from 0-255, and then results in that many
// bytes of output. Writing the 4 bytes {ff,ff,ff,ff} would result in
// 1kb of data being output. In this case, you could write a very small
// amount of input, and end up with a very large amount of output. In
// such a pathological inflating mechanism, there'd be no way to tell
// the system to stop doing the transform. A single 4MB write could
// cause the system to run out of memory.
//
// However, even in such a pathological case, only a single written chunk
// would be consumed, and then the rest would wait (un-transformed) until
// the results of the previous transformed chunk were consumed.
module.exports = Transform;
var Duplex = require('./_stream_duplex');
/*<replacement>*/
var util = require('core-util-is');
util.inherits = require('inherits');
/*</replacement>*/
util.inherits(Transform, Duplex);
function TransformState(options, stream) {
this.afterTransform = function(er, data) {
return afterTransform(stream, er, data);
};
this.needTransform = false;
this.transforming = false;
this.writecb = null;
this.writechunk = null;
}
function afterTransform(stream, er, data) {
var ts = stream._transformState;
ts.transforming = false;
var cb = ts.writecb;
if (!cb)
return stream.emit('error', new Error('no writecb in Transform class'));
ts.writechunk = null;
ts.writecb = null;
if (!util.isNullOrUndefined(data))
stream.push(data);
if (cb)
cb(er);
var rs = stream._readableState;
rs.reading = false;
if (rs.needReadable || rs.length < rs.highWaterMark) {
stream._read(rs.highWaterMark);
}
}
function Transform(options) {
if (!(this instanceof Transform))
return new Transform(options);
Duplex.call(this, options);
this._transformState = new TransformState(options, this);
// when the writable side finishes, then flush out anything remaining.
var stream = this;
// start out asking for a readable event once data is transformed.
this._readableState.needReadable = true;
// we have implemented the _read method, and done the other things
// that Readable wants before the first _read call, so unset the
// sync guard flag.
this._readableState.sync = false;
this.once('prefinish', function() {
if (util.isFunction(this._flush))
this._flush(function(er) {
done(stream, er);
});
else
done(stream);
});
}
Transform.prototype.push = function(chunk, encoding) {
this._transformState.needTransform = false;
return Duplex.prototype.push.call(this, chunk, encoding);
};
// This is the part where you do stuff!
// override this function in implementation classes.
// 'chunk' is an input chunk.
//
// Call `push(newChunk)` to pass along transformed output
// to the readable side. You may call 'push' zero or more times.
//
// Call `cb(err)` when you are done with this chunk. If you pass
// an error, then that'll put the hurt on the whole operation. If you
// never call cb(), then you'll never get another chunk.
Transform.prototype._transform = function(chunk, encoding, cb) {
throw new Error('not implemented');
};
Transform.prototype._write = function(chunk, encoding, cb) {
var ts = this._transformState;
ts.writecb = cb;
ts.writechunk = chunk;
ts.writeencoding = encoding;
if (!ts.transforming) {
var rs = this._readableState;
if (ts.needTransform ||
rs.needReadable ||
rs.length < rs.highWaterMark)
this._read(rs.highWaterMark);
}
};
// Doesn't matter what the args are here.
// _transform does all the work.
// That we got here means that the readable side wants more data.
Transform.prototype._read = function(n) {
var ts = this._transformState;
if (!util.isNull(ts.writechunk) && ts.writecb && !ts.transforming) {
ts.transforming = true;
this._transform(ts.writechunk, ts.writeencoding, ts.afterTransform);
} else {
// mark that we need a transform, so that any data that comes in
// will get processed, now that we've asked for it.
ts.needTransform = true;
}
};
function done(stream, er) {
if (er)
return stream.emit('error', er);
// if there's nothing in the write buffer, then that means
// that nothing more will ever be provided
var ws = stream._writableState;
var ts = stream._transformState;
if (ws.length)
throw new Error('calling transform done when ws.length != 0');
if (ts.transforming)
throw new Error('calling transform done when still transforming');
return stream.push(null);
}
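// Usage sketch (illustrative, not part of the vendored source above):
// a Transform that upper-cases its input, using Node's public stream
// module which exposes the same Transform class. _transform() receives
// each written chunk and may push zero or more output chunks before
// calling cb(); passing data as cb(null, data) pushes it, as in
// afterTransform above. The name Upper is hypothetical.
var PublicTransform = require('stream').Transform;
var nodeUtil = require('util');

function Upper(options) {
  if (!(this instanceof Upper)) return new Upper(options);
  PublicTransform.call(this, options);
}
nodeUtil.inherits(Upper, PublicTransform);

Upper.prototype._transform = function(chunk, encoding, cb) {
  cb(null, chunk.toString().toUpperCase());
};

process.stdin.pipe(new Upper()).pipe(process.stdout);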

View File

@@ -0,0 +1,477 @@
// Copyright Joyent, Inc. and other Node contributors.
//
// Permission is hereby granted, free of charge, to any person obtaining a
// copy of this software and associated documentation files (the
// "Software"), to deal in the Software without restriction, including
// without limitation the rights to use, copy, modify, merge, publish,
// distribute, sublicense, and/or sell copies of the Software, and to permit
// persons to whom the Software is furnished to do so, subject to the
// following conditions:
//
// The above copyright notice and this permission notice shall be included
// in all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
// USE OR OTHER DEALINGS IN THE SOFTWARE.
// A bit simpler than readable streams.
// Implement an async ._write(chunk, cb), and it'll handle all
// the drain event emission and buffering.
module.exports = Writable;
/*<replacement>*/
var Buffer = require('buffer').Buffer;
/*</replacement>*/
Writable.WritableState = WritableState;
/*<replacement>*/
var util = require('core-util-is');
util.inherits = require('inherits');
/*</replacement>*/
var Stream = require('stream');
util.inherits(Writable, Stream);
function WriteReq(chunk, encoding, cb) {
this.chunk = chunk;
this.encoding = encoding;
this.callback = cb;
}
function WritableState(options, stream) {
var Duplex = require('./_stream_duplex');
options = options || {};
// the point at which write() starts returning false
// Note: 0 is a valid value, means that we always return false if
// the entire buffer is not flushed immediately on write()
var hwm = options.highWaterMark;
var defaultHwm = options.objectMode ? 16 : 16 * 1024;
this.highWaterMark = (hwm || hwm === 0) ? hwm : defaultHwm;
// object stream flag to indicate whether or not this stream
// contains buffers or objects.
this.objectMode = !!options.objectMode;
if (stream instanceof Duplex)
this.objectMode = this.objectMode || !!options.writableObjectMode;
// cast to ints.
this.highWaterMark = ~~this.highWaterMark;
this.needDrain = false;
// at the start of calling end()
this.ending = false;
// when end() has been called, and returned
this.ended = false;
// when 'finish' is emitted
this.finished = false;
// should we decode strings into buffers before passing to _write?
// this is here so that some node-core streams can optimize string
// handling at a lower level.
var noDecode = options.decodeStrings === false;
this.decodeStrings = !noDecode;
// Crypto is kind of old and crusty. Historically, its default string
// encoding is 'binary' so we have to make this configurable.
// Everything else in the universe uses 'utf8', though.
this.defaultEncoding = options.defaultEncoding || 'utf8';
// not an actual buffer we keep track of, but a measurement
// of how much we're waiting to get pushed to some underlying
// socket or file.
this.length = 0;
// a flag to see when we're in the middle of a write.
this.writing = false;
// when true all writes will be buffered until .uncork() call
this.corked = 0;
// a flag to be able to tell if the onwrite cb is called immediately,
// or on a later tick. We set this to true at first, because any
// actions that shouldn't happen until "later" should generally also
// not happen before the first write call.
this.sync = true;
// a flag to know if we're processing previously buffered items, which
// may call the _write() callback in the same tick, so that we don't
// end up in an overlapped onwrite situation.
this.bufferProcessing = false;
// the callback that's passed to _write(chunk,cb)
this.onwrite = function(er) {
onwrite(stream, er);
};
// the callback that the user supplies to write(chunk,encoding,cb)
this.writecb = null;
// the amount that is being written when _write is called.
this.writelen = 0;
this.buffer = [];
// number of pending user-supplied write callbacks
// this must be 0 before 'finish' can be emitted
this.pendingcb = 0;
// emit prefinish if the only thing we're waiting for is _write cbs
// This is relevant for synchronous Transform streams
this.prefinished = false;
// True if the error was already emitted and should not be thrown again
this.errorEmitted = false;
}
function Writable(options) {
var Duplex = require('./_stream_duplex');
// Writable ctor is applied to Duplexes, though they're not
// instanceof Writable, they're instanceof Readable.
if (!(this instanceof Writable) && !(this instanceof Duplex))
return new Writable(options);
this._writableState = new WritableState(options, this);
// legacy.
this.writable = true;
Stream.call(this);
}
// Otherwise people can pipe Writable streams, which is just wrong.
Writable.prototype.pipe = function() {
this.emit('error', new Error('Cannot pipe. Not readable.'));
};
function writeAfterEnd(stream, state, cb) {
var er = new Error('write after end');
// TODO: defer error events consistently everywhere, not just the cb
stream.emit('error', er);
process.nextTick(function() {
cb(er);
});
}
// If we get something that is not a buffer, string, null, or undefined,
// and we're not in objectMode, then that's an error.
// Otherwise stream chunks are all considered to be of length=1, and the
// watermarks determine how many objects to keep in the buffer, rather than
// how many bytes or characters.
function validChunk(stream, state, chunk, cb) {
var valid = true;
if (!util.isBuffer(chunk) &&
!util.isString(chunk) &&
!util.isNullOrUndefined(chunk) &&
!state.objectMode) {
var er = new TypeError('Invalid non-string/buffer chunk');
stream.emit('error', er);
process.nextTick(function() {
cb(er);
});
valid = false;
}
return valid;
}
Writable.prototype.write = function(chunk, encoding, cb) {
var state = this._writableState;
var ret = false;
if (util.isFunction(encoding)) {
cb = encoding;
encoding = null;
}
if (util.isBuffer(chunk))
encoding = 'buffer';
else if (!encoding)
encoding = state.defaultEncoding;
if (!util.isFunction(cb))
cb = function() {};
if (state.ended)
writeAfterEnd(this, state, cb);
else if (validChunk(this, state, chunk, cb)) {
state.pendingcb++;
ret = writeOrBuffer(this, state, chunk, encoding, cb);
}
return ret;
};
Writable.prototype.cork = function() {
var state = this._writableState;
state.corked++;
};
Writable.prototype.uncork = function() {
var state = this._writableState;
if (state.corked) {
state.corked--;
if (!state.writing &&
!state.corked &&
!state.finished &&
!state.bufferProcessing &&
state.buffer.length)
clearBuffer(this, state);
}
};
function decodeChunk(state, chunk, encoding) {
if (!state.objectMode &&
state.decodeStrings !== false &&
util.isString(chunk)) {
chunk = new Buffer(chunk, encoding);
}
return chunk;
}
// if we're already writing something, then just put this
// in the queue, and wait our turn. Otherwise, call _write
// If we return false, then we need a drain event, so set that flag.
function writeOrBuffer(stream, state, chunk, encoding, cb) {
chunk = decodeChunk(state, chunk, encoding);
if (util.isBuffer(chunk))
encoding = 'buffer';
var len = state.objectMode ? 1 : chunk.length;
state.length += len;
var ret = state.length < state.highWaterMark;
// we must ensure that previous needDrain will not be reset to false.
if (!ret)
state.needDrain = true;
if (state.writing || state.corked)
state.buffer.push(new WriteReq(chunk, encoding, cb));
else
doWrite(stream, state, false, len, chunk, encoding, cb);
return ret;
}
function doWrite(stream, state, writev, len, chunk, encoding, cb) {
state.writelen = len;
state.writecb = cb;
state.writing = true;
state.sync = true;
if (writev)
stream._writev(chunk, state.onwrite);
else
stream._write(chunk, encoding, state.onwrite);
state.sync = false;
}
function onwriteError(stream, state, sync, er, cb) {
if (sync)
process.nextTick(function() {
state.pendingcb--;
cb(er);
});
else {
state.pendingcb--;
cb(er);
}
stream._writableState.errorEmitted = true;
stream.emit('error', er);
}
function onwriteStateUpdate(state) {
state.writing = false;
state.writecb = null;
state.length -= state.writelen;
state.writelen = 0;
}
function onwrite(stream, er) {
var state = stream._writableState;
var sync = state.sync;
var cb = state.writecb;
onwriteStateUpdate(state);
if (er)
onwriteError(stream, state, sync, er, cb);
else {
// Check if we're actually ready to finish, but don't emit yet
var finished = needFinish(stream, state);
if (!finished &&
!state.corked &&
!state.bufferProcessing &&
state.buffer.length) {
clearBuffer(stream, state);
}
if (sync) {
process.nextTick(function() {
afterWrite(stream, state, finished, cb);
});
} else {
afterWrite(stream, state, finished, cb);
}
}
}
function afterWrite(stream, state, finished, cb) {
if (!finished)
onwriteDrain(stream, state);
state.pendingcb--;
cb();
finishMaybe(stream, state);
}
// Must force callback to be called on nextTick, so that we don't
// emit 'drain' before the write() consumer gets the 'false' return
// value, and has a chance to attach a 'drain' listener.
function onwriteDrain(stream, state) {
if (state.length === 0 && state.needDrain) {
state.needDrain = false;
stream.emit('drain');
}
}
// if there's something in the buffer waiting, then process it
function clearBuffer(stream, state) {
state.bufferProcessing = true;
if (stream._writev && state.buffer.length > 1) {
// Fast case, write everything using _writev()
var cbs = [];
for (var c = 0; c < state.buffer.length; c++)
cbs.push(state.buffer[c].callback);
// count the one we are adding, as well.
// TODO(isaacs) clean this up
state.pendingcb++;
doWrite(stream, state, true, state.length, state.buffer, '', function(err) {
for (var i = 0; i < cbs.length; i++) {
state.pendingcb--;
cbs[i](err);
}
});
// Clear buffer
state.buffer = [];
} else {
// Slow case, write chunks one-by-one
for (var c = 0; c < state.buffer.length; c++) {
var entry = state.buffer[c];
var chunk = entry.chunk;
var encoding = entry.encoding;
var cb = entry.callback;
var len = state.objectMode ? 1 : chunk.length;
doWrite(stream, state, false, len, chunk, encoding, cb);
// if we didn't call the onwrite immediately, then
// it means that we need to wait until it does.
// also, that means that the chunk and cb are currently
// being processed, so move the buffer counter past them.
if (state.writing) {
c++;
break;
}
}
if (c < state.buffer.length)
state.buffer = state.buffer.slice(c);
else
state.buffer.length = 0;
}
state.bufferProcessing = false;
}
Writable.prototype._write = function(chunk, encoding, cb) {
cb(new Error('not implemented'));
};
Writable.prototype._writev = null;
Writable.prototype.end = function(chunk, encoding, cb) {
var state = this._writableState;
if (util.isFunction(chunk)) {
cb = chunk;
chunk = null;
encoding = null;
} else if (util.isFunction(encoding)) {
cb = encoding;
encoding = null;
}
if (!util.isNullOrUndefined(chunk))
this.write(chunk, encoding);
// .end() fully uncorks
if (state.corked) {
state.corked = 1;
this.uncork();
}
// ignore unnecessary end() calls.
if (!state.ending && !state.finished)
endWritable(this, state, cb);
};
function needFinish(stream, state) {
return (state.ending &&
state.length === 0 &&
!state.finished &&
!state.writing);
}
function prefinish(stream, state) {
if (!state.prefinished) {
state.prefinished = true;
stream.emit('prefinish');
}
}
function finishMaybe(stream, state) {
var need = needFinish(stream, state);
if (need) {
if (state.pendingcb === 0) {
prefinish(stream, state);
state.finished = true;
stream.emit('finish');
} else
prefinish(stream, state);
}
return need;
}
function endWritable(stream, state, cb) {
state.ending = true;
finishMaybe(stream, state);
if (cb) {
if (state.finished)
process.nextTick(cb);
else
stream.once('finish', cb);
}
state.ended = true;
}
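// Usage sketch (illustrative, not part of the vendored source above):
// a minimal Writable sink using Node's public stream module, which
// exposes the Writable class implemented here. Only _write(chunk,
// encoding, cb) needs to be supplied; calling cb() signals that the
// chunk has been consumed. The names sink and total are hypothetical.
var PublicWritable = require('stream').Writable;
var sink = new PublicWritable();
var total = 0;
sink._write = function(chunk, encoding, cb) {
  total += chunk.length;   // count bytes instead of storing them
  cb();
};
sink.on('finish', function() {
  console.log('received', total, 'bytes');
});
sink.write('hello ');
sink.end('world');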

View File

@@ -0,0 +1,3 @@
# core-util-is
The `util.is*` functions introduced in Node v0.12.
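A brief usage sketch; the function names below are the ones the stream sources in this package already call on `core-util-is`:

```js
var util = require('core-util-is');

util.isString('abc');            // true
util.isBuffer(new Buffer(4));    // true
util.isNullOrUndefined(null);    // true
util.isFunction(function() {});  // true
```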

View File

@@ -0,0 +1,604 @@
diff --git a/lib/util.js b/lib/util.js
index a03e874..9074e8e 100644
--- a/lib/util.js
+++ b/lib/util.js
@@ -19,430 +19,6 @@
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
// USE OR OTHER DEALINGS IN THE SOFTWARE.
-var formatRegExp = /%[sdj%]/g;
-exports.format = function(f) {
- if (!isString(f)) {
- var objects = [];
- for (var i = 0; i < arguments.length; i++) {
- objects.push(inspect(arguments[i]));
- }
- return objects.join(' ');
- }
-
- var i = 1;
- var args = arguments;
- var len = args.length;
- var str = String(f).replace(formatRegExp, function(x) {
- if (x === '%%') return '%';
- if (i >= len) return x;
- switch (x) {
- case '%s': return String(args[i++]);
- case '%d': return Number(args[i++]);
- case '%j':
- try {
- return JSON.stringify(args[i++]);
- } catch (_) {
- return '[Circular]';
- }
- default:
- return x;
- }
- });
- for (var x = args[i]; i < len; x = args[++i]) {
- if (isNull(x) || !isObject(x)) {
- str += ' ' + x;
- } else {
- str += ' ' + inspect(x);
- }
- }
- return str;
-};
-
-
-// Mark that a method should not be used.
-// Returns a modified function which warns once by default.
-// If --no-deprecation is set, then it is a no-op.
-exports.deprecate = function(fn, msg) {
- // Allow for deprecating things in the process of starting up.
- if (isUndefined(global.process)) {
- return function() {
- return exports.deprecate(fn, msg).apply(this, arguments);
- };
- }
-
- if (process.noDeprecation === true) {
- return fn;
- }
-
- var warned = false;
- function deprecated() {
- if (!warned) {
- if (process.throwDeprecation) {
- throw new Error(msg);
- } else if (process.traceDeprecation) {
- console.trace(msg);
- } else {
- console.error(msg);
- }
- warned = true;
- }
- return fn.apply(this, arguments);
- }
-
- return deprecated;
-};
-
-
-var debugs = {};
-var debugEnviron;
-exports.debuglog = function(set) {
- if (isUndefined(debugEnviron))
- debugEnviron = process.env.NODE_DEBUG || '';
- set = set.toUpperCase();
- if (!debugs[set]) {
- if (new RegExp('\\b' + set + '\\b', 'i').test(debugEnviron)) {
- var pid = process.pid;
- debugs[set] = function() {
- var msg = exports.format.apply(exports, arguments);
- console.error('%s %d: %s', set, pid, msg);
- };
- } else {
- debugs[set] = function() {};
- }
- }
- return debugs[set];
-};
-
-
-/**
- * Echos the value of a value. Trys to print the value out
- * in the best way possible given the different types.
- *
- * @param {Object} obj The object to print out.
- * @param {Object} opts Optional options object that alters the output.
- */
-/* legacy: obj, showHidden, depth, colors*/
-function inspect(obj, opts) {
- // default options
- var ctx = {
- seen: [],
- stylize: stylizeNoColor
- };
- // legacy...
- if (arguments.length >= 3) ctx.depth = arguments[2];
- if (arguments.length >= 4) ctx.colors = arguments[3];
- if (isBoolean(opts)) {
- // legacy...
- ctx.showHidden = opts;
- } else if (opts) {
- // got an "options" object
- exports._extend(ctx, opts);
- }
- // set default options
- if (isUndefined(ctx.showHidden)) ctx.showHidden = false;
- if (isUndefined(ctx.depth)) ctx.depth = 2;
- if (isUndefined(ctx.colors)) ctx.colors = false;
- if (isUndefined(ctx.customInspect)) ctx.customInspect = true;
- if (ctx.colors) ctx.stylize = stylizeWithColor;
- return formatValue(ctx, obj, ctx.depth);
-}
-exports.inspect = inspect;
-
-
-// http://en.wikipedia.org/wiki/ANSI_escape_code#graphics
-inspect.colors = {
- 'bold' : [1, 22],
- 'italic' : [3, 23],
- 'underline' : [4, 24],
- 'inverse' : [7, 27],
- 'white' : [37, 39],
- 'grey' : [90, 39],
- 'black' : [30, 39],
- 'blue' : [34, 39],
- 'cyan' : [36, 39],
- 'green' : [32, 39],
- 'magenta' : [35, 39],
- 'red' : [31, 39],
- 'yellow' : [33, 39]
-};
-
-// Don't use 'blue' not visible on cmd.exe
-inspect.styles = {
- 'special': 'cyan',
- 'number': 'yellow',
- 'boolean': 'yellow',
- 'undefined': 'grey',
- 'null': 'bold',
- 'string': 'green',
- 'date': 'magenta',
- // "name": intentionally not styling
- 'regexp': 'red'
-};
-
-
-function stylizeWithColor(str, styleType) {
- var style = inspect.styles[styleType];
-
- if (style) {
- return '\u001b[' + inspect.colors[style][0] + 'm' + str +
- '\u001b[' + inspect.colors[style][1] + 'm';
- } else {
- return str;
- }
-}
-
-
-function stylizeNoColor(str, styleType) {
- return str;
-}
-
-
-function arrayToHash(array) {
- var hash = {};
-
- array.forEach(function(val, idx) {
- hash[val] = true;
- });
-
- return hash;
-}
-
-
-function formatValue(ctx, value, recurseTimes) {
- // Provide a hook for user-specified inspect functions.
- // Check that value is an object with an inspect function on it
- if (ctx.customInspect &&
- value &&
- isFunction(value.inspect) &&
- // Filter out the util module, it's inspect function is special
- value.inspect !== exports.inspect &&
- // Also filter out any prototype objects using the circular check.
- !(value.constructor && value.constructor.prototype === value)) {
- var ret = value.inspect(recurseTimes, ctx);
- if (!isString(ret)) {
- ret = formatValue(ctx, ret, recurseTimes);
- }
- return ret;
- }
-
- // Primitive types cannot have properties
- var primitive = formatPrimitive(ctx, value);
- if (primitive) {
- return primitive;
- }
-
- // Look up the keys of the object.
- var keys = Object.keys(value);
- var visibleKeys = arrayToHash(keys);
-
- if (ctx.showHidden) {
- keys = Object.getOwnPropertyNames(value);
- }
-
- // Some type of object without properties can be shortcutted.
- if (keys.length === 0) {
- if (isFunction(value)) {
- var name = value.name ? ': ' + value.name : '';
- return ctx.stylize('[Function' + name + ']', 'special');
- }
- if (isRegExp(value)) {
- return ctx.stylize(RegExp.prototype.toString.call(value), 'regexp');
- }
- if (isDate(value)) {
- return ctx.stylize(Date.prototype.toString.call(value), 'date');
- }
- if (isError(value)) {
- return formatError(value);
- }
- }
-
- var base = '', array = false, braces = ['{', '}'];
-
- // Make Array say that they are Array
- if (isArray(value)) {
- array = true;
- braces = ['[', ']'];
- }
-
- // Make functions say that they are functions
- if (isFunction(value)) {
- var n = value.name ? ': ' + value.name : '';
- base = ' [Function' + n + ']';
- }
-
- // Make RegExps say that they are RegExps
- if (isRegExp(value)) {
- base = ' ' + RegExp.prototype.toString.call(value);
- }
-
- // Make dates with properties first say the date
- if (isDate(value)) {
- base = ' ' + Date.prototype.toUTCString.call(value);
- }
-
- // Make error with message first say the error
- if (isError(value)) {
- base = ' ' + formatError(value);
- }
-
- if (keys.length === 0 && (!array || value.length == 0)) {
- return braces[0] + base + braces[1];
- }
-
- if (recurseTimes < 0) {
- if (isRegExp(value)) {
- return ctx.stylize(RegExp.prototype.toString.call(value), 'regexp');
- } else {
- return ctx.stylize('[Object]', 'special');
- }
- }
-
- ctx.seen.push(value);
-
- var output;
- if (array) {
- output = formatArray(ctx, value, recurseTimes, visibleKeys, keys);
- } else {
- output = keys.map(function(key) {
- return formatProperty(ctx, value, recurseTimes, visibleKeys, key, array);
- });
- }
-
- ctx.seen.pop();
-
- return reduceToSingleString(output, base, braces);
-}
-
-
-function formatPrimitive(ctx, value) {
- if (isUndefined(value))
- return ctx.stylize('undefined', 'undefined');
- if (isString(value)) {
- var simple = '\'' + JSON.stringify(value).replace(/^"|"$/g, '')
- .replace(/'/g, "\\'")
- .replace(/\\"/g, '"') + '\'';
- return ctx.stylize(simple, 'string');
- }
- if (isNumber(value)) {
- // Format -0 as '-0'. Strict equality won't distinguish 0 from -0,
- // so instead we use the fact that 1 / -0 < 0 whereas 1 / 0 > 0 .
- if (value === 0 && 1 / value < 0)
- return ctx.stylize('-0', 'number');
- return ctx.stylize('' + value, 'number');
- }
- if (isBoolean(value))
- return ctx.stylize('' + value, 'boolean');
- // For some reason typeof null is "object", so special case here.
- if (isNull(value))
- return ctx.stylize('null', 'null');
-}
-
-
-function formatError(value) {
- return '[' + Error.prototype.toString.call(value) + ']';
-}
-
-
-function formatArray(ctx, value, recurseTimes, visibleKeys, keys) {
- var output = [];
- for (var i = 0, l = value.length; i < l; ++i) {
- if (hasOwnProperty(value, String(i))) {
- output.push(formatProperty(ctx, value, recurseTimes, visibleKeys,
- String(i), true));
- } else {
- output.push('');
- }
- }
- keys.forEach(function(key) {
- if (!key.match(/^\d+$/)) {
- output.push(formatProperty(ctx, value, recurseTimes, visibleKeys,
- key, true));
- }
- });
- return output;
-}
-
-
-function formatProperty(ctx, value, recurseTimes, visibleKeys, key, array) {
- var name, str, desc;
- desc = Object.getOwnPropertyDescriptor(value, key) || { value: value[key] };
- if (desc.get) {
- if (desc.set) {
- str = ctx.stylize('[Getter/Setter]', 'special');
- } else {
- str = ctx.stylize('[Getter]', 'special');
- }
- } else {
- if (desc.set) {
- str = ctx.stylize('[Setter]', 'special');
- }
- }
- if (!hasOwnProperty(visibleKeys, key)) {
- name = '[' + key + ']';
- }
- if (!str) {
- if (ctx.seen.indexOf(desc.value) < 0) {
- if (isNull(recurseTimes)) {
- str = formatValue(ctx, desc.value, null);
- } else {
- str = formatValue(ctx, desc.value, recurseTimes - 1);
- }
- if (str.indexOf('\n') > -1) {
- if (array) {
- str = str.split('\n').map(function(line) {
- return ' ' + line;
- }).join('\n').substr(2);
- } else {
- str = '\n' + str.split('\n').map(function(line) {
- return ' ' + line;
- }).join('\n');
- }
- }
- } else {
- str = ctx.stylize('[Circular]', 'special');
- }
- }
- if (isUndefined(name)) {
- if (array && key.match(/^\d+$/)) {
- return str;
- }
- name = JSON.stringify('' + key);
- if (name.match(/^"([a-zA-Z_][a-zA-Z_0-9]*)"$/)) {
- name = name.substr(1, name.length - 2);
- name = ctx.stylize(name, 'name');
- } else {
- name = name.replace(/'/g, "\\'")
- .replace(/\\"/g, '"')
- .replace(/(^"|"$)/g, "'");
- name = ctx.stylize(name, 'string');
- }
- }
-
- return name + ': ' + str;
-}
-
-
-function reduceToSingleString(output, base, braces) {
- var numLinesEst = 0;
- var length = output.reduce(function(prev, cur) {
- numLinesEst++;
- if (cur.indexOf('\n') >= 0) numLinesEst++;
- return prev + cur.replace(/\u001b\[\d\d?m/g, '').length + 1;
- }, 0);
-
- if (length > 60) {
- return braces[0] +
- (base === '' ? '' : base + '\n ') +
- ' ' +
- output.join(',\n ') +
- ' ' +
- braces[1];
- }
-
- return braces[0] + base + ' ' + output.join(', ') + ' ' + braces[1];
-}
-
-
// NOTE: These type checking functions intentionally don't use `instanceof`
// because it is fragile and can be easily faked with `Object.create()`.
function isArray(ar) {
@@ -522,166 +98,10 @@ function isPrimitive(arg) {
exports.isPrimitive = isPrimitive;
function isBuffer(arg) {
- return arg instanceof Buffer;
+ return Buffer.isBuffer(arg);
}
exports.isBuffer = isBuffer;
function objectToString(o) {
return Object.prototype.toString.call(o);
-}
-
-
-function pad(n) {
- return n < 10 ? '0' + n.toString(10) : n.toString(10);
-}
-
-
-var months = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun', 'Jul', 'Aug', 'Sep',
- 'Oct', 'Nov', 'Dec'];
-
-// 26 Feb 16:19:34
-function timestamp() {
- var d = new Date();
- var time = [pad(d.getHours()),
- pad(d.getMinutes()),
- pad(d.getSeconds())].join(':');
- return [d.getDate(), months[d.getMonth()], time].join(' ');
-}
-
-
-// log is just a thin wrapper to console.log that prepends a timestamp
-exports.log = function() {
- console.log('%s - %s', timestamp(), exports.format.apply(exports, arguments));
-};
-
-
-/**
- * Inherit the prototype methods from one constructor into another.
- *
- * The Function.prototype.inherits from lang.js rewritten as a standalone
- * function (not on Function.prototype). NOTE: If this file is to be loaded
- * during bootstrapping this function needs to be rewritten using some native
- * functions as prototype setup using normal JavaScript does not work as
- * expected during bootstrapping (see mirror.js in r114903).
- *
- * @param {function} ctor Constructor function which needs to inherit the
- * prototype.
- * @param {function} superCtor Constructor function to inherit prototype from.
- */
-exports.inherits = function(ctor, superCtor) {
- ctor.super_ = superCtor;
- ctor.prototype = Object.create(superCtor.prototype, {
- constructor: {
- value: ctor,
- enumerable: false,
- writable: true,
- configurable: true
- }
- });
-};
-
-exports._extend = function(origin, add) {
- // Don't do anything if add isn't an object
- if (!add || !isObject(add)) return origin;
-
- var keys = Object.keys(add);
- var i = keys.length;
- while (i--) {
- origin[keys[i]] = add[keys[i]];
- }
- return origin;
-};
-
-function hasOwnProperty(obj, prop) {
- return Object.prototype.hasOwnProperty.call(obj, prop);
-}
-
-
-// Deprecated old stuff.
-
-exports.p = exports.deprecate(function() {
- for (var i = 0, len = arguments.length; i < len; ++i) {
- console.error(exports.inspect(arguments[i]));
- }
-}, 'util.p: Use console.error() instead');
-
-
-exports.exec = exports.deprecate(function() {
- return require('child_process').exec.apply(this, arguments);
-}, 'util.exec is now called `child_process.exec`.');
-
-
-exports.print = exports.deprecate(function() {
- for (var i = 0, len = arguments.length; i < len; ++i) {
- process.stdout.write(String(arguments[i]));
- }
-}, 'util.print: Use console.log instead');
-
-
-exports.puts = exports.deprecate(function() {
- for (var i = 0, len = arguments.length; i < len; ++i) {
- process.stdout.write(arguments[i] + '\n');
- }
-}, 'util.puts: Use console.log instead');
-
-
-exports.debug = exports.deprecate(function(x) {
- process.stderr.write('DEBUG: ' + x + '\n');
-}, 'util.debug: Use console.error instead');
-
-
-exports.error = exports.deprecate(function(x) {
- for (var i = 0, len = arguments.length; i < len; ++i) {
- process.stderr.write(arguments[i] + '\n');
- }
-}, 'util.error: Use console.error instead');
-
-
-exports.pump = exports.deprecate(function(readStream, writeStream, callback) {
- var callbackCalled = false;
-
- function call(a, b, c) {
- if (callback && !callbackCalled) {
- callback(a, b, c);
- callbackCalled = true;
- }
- }
-
- readStream.addListener('data', function(chunk) {
- if (writeStream.write(chunk) === false) readStream.pause();
- });
-
- writeStream.addListener('drain', function() {
- readStream.resume();
- });
-
- readStream.addListener('end', function() {
- writeStream.end();
- });
-
- readStream.addListener('close', function() {
- call();
- });
-
- readStream.addListener('error', function(err) {
- writeStream.end();
- call(err);
- });
-
- writeStream.addListener('error', function(err) {
- readStream.destroy();
- call(err);
- });
-}, 'util.pump(): Use readableStream.pipe() instead');
-
-
-var uv;
-exports._errnoException = function(err, syscall) {
- if (isUndefined(uv)) uv = process.binding('uv');
- var errname = uv.errname(err);
- var e = new Error(syscall + ' ' + errname);
- e.code = errname;
- e.errno = errname;
- e.syscall = syscall;
- return e;
-};
+}

View File

@@ -0,0 +1,107 @@
// Copyright Joyent, Inc. and other Node contributors.
//
// Permission is hereby granted, free of charge, to any person obtaining a
// copy of this software and associated documentation files (the
// "Software"), to deal in the Software without restriction, including
// without limitation the rights to use, copy, modify, merge, publish,
// distribute, sublicense, and/or sell copies of the Software, and to permit
// persons to whom the Software is furnished to do so, subject to the
// following conditions:
//
// The above copyright notice and this permission notice shall be included
// in all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
// USE OR OTHER DEALINGS IN THE SOFTWARE.
// NOTE: These type checking functions intentionally don't use `instanceof`
// because it is fragile and can be easily faked with `Object.create()`.
function isArray(ar) {
return Array.isArray(ar);
}
exports.isArray = isArray;
function isBoolean(arg) {
return typeof arg === 'boolean';
}
exports.isBoolean = isBoolean;
function isNull(arg) {
return arg === null;
}
exports.isNull = isNull;
function isNullOrUndefined(arg) {
return arg == null;
}
exports.isNullOrUndefined = isNullOrUndefined;
function isNumber(arg) {
return typeof arg === 'number';
}
exports.isNumber = isNumber;
function isString(arg) {
return typeof arg === 'string';
}
exports.isString = isString;
function isSymbol(arg) {
return typeof arg === 'symbol';
}
exports.isSymbol = isSymbol;
function isUndefined(arg) {
return arg === void 0;
}
exports.isUndefined = isUndefined;
function isRegExp(re) {
return isObject(re) && objectToString(re) === '[object RegExp]';
}
exports.isRegExp = isRegExp;
function isObject(arg) {
return typeof arg === 'object' && arg !== null;
}
exports.isObject = isObject;
function isDate(d) {
return isObject(d) && objectToString(d) === '[object Date]';
}
exports.isDate = isDate;
function isError(e) {
return isObject(e) &&
(objectToString(e) === '[object Error]' || e instanceof Error);
}
exports.isError = isError;
function isFunction(arg) {
return typeof arg === 'function';
}
exports.isFunction = isFunction;
function isPrimitive(arg) {
return arg === null ||
typeof arg === 'boolean' ||
typeof arg === 'number' ||
typeof arg === 'string' ||
typeof arg === 'symbol' || // ES6 symbol
typeof arg === 'undefined';
}
exports.isPrimitive = isPrimitive;
function isBuffer(arg) {
return Buffer.isBuffer(arg);
}
exports.isBuffer = isBuffer;
function objectToString(o) {
return Object.prototype.toString.call(o);
}

View File

@@ -0,0 +1,37 @@
{
"name": "core-util-is",
"version": "1.0.1",
"description": "The `util.is*` functions introduced in Node v0.12.",
"main": "lib/util.js",
"repository": {
"type": "git",
"url": "git://github.com/isaacs/core-util-is.git"
},
"keywords": [
"util",
"isBuffer",
"isArray",
"isNumber",
"isString",
"isRegExp",
"isThis",
"isThat",
"polyfill"
],
"author": {
"name": "Isaac Z. Schlueter",
"email": "i@izs.me",
"url": "http://blog.izs.me/"
},
"license": "MIT",
"bugs": {
"url": "https://github.com/isaacs/core-util-is/issues"
},
"readme": "# core-util-is\n\nThe `util.is*` functions introduced in Node v0.12.\n",
"readmeFilename": "README.md",
"homepage": "https://github.com/isaacs/core-util-is#readme",
"_id": "core-util-is@1.0.1",
"_shasum": "6b07085aef9a3ccac6ee53bf9d3df0c1521a5538",
"_resolved": "https://registry.npmjs.org/core-util-is/-/core-util-is-1.0.1.tgz",
"_from": "core-util-is@>=1.0.0 <1.1.0"
}

View File

@@ -0,0 +1,106 @@
// Copyright Joyent, Inc. and other Node contributors.
//
// Permission is hereby granted, free of charge, to any person obtaining a
// copy of this software and associated documentation files (the
// "Software"), to deal in the Software without restriction, including
// without limitation the rights to use, copy, modify, merge, publish,
// distribute, sublicense, and/or sell copies of the Software, and to permit
// persons to whom the Software is furnished to do so, subject to the
// following conditions:
//
// The above copyright notice and this permission notice shall be included
// in all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
// USE OR OTHER DEALINGS IN THE SOFTWARE.
// NOTE: These type checking functions intentionally don't use `instanceof`
// because it is fragile and can be easily faked with `Object.create()`.
function isArray(ar) {
return Array.isArray(ar);
}
exports.isArray = isArray;
function isBoolean(arg) {
return typeof arg === 'boolean';
}
exports.isBoolean = isBoolean;
function isNull(arg) {
return arg === null;
}
exports.isNull = isNull;
function isNullOrUndefined(arg) {
return arg == null;
}
exports.isNullOrUndefined = isNullOrUndefined;
function isNumber(arg) {
return typeof arg === 'number';
}
exports.isNumber = isNumber;
function isString(arg) {
return typeof arg === 'string';
}
exports.isString = isString;
function isSymbol(arg) {
return typeof arg === 'symbol';
}
exports.isSymbol = isSymbol;
function isUndefined(arg) {
return arg === void 0;
}
exports.isUndefined = isUndefined;
function isRegExp(re) {
return isObject(re) && objectToString(re) === '[object RegExp]';
}
exports.isRegExp = isRegExp;
function isObject(arg) {
return typeof arg === 'object' && arg !== null;
}
exports.isObject = isObject;
function isDate(d) {
return isObject(d) && objectToString(d) === '[object Date]';
}
exports.isDate = isDate;
function isError(e) {
return isObject(e) && objectToString(e) === '[object Error]';
}
exports.isError = isError;
function isFunction(arg) {
return typeof arg === 'function';
}
exports.isFunction = isFunction;
function isPrimitive(arg) {
return arg === null ||
typeof arg === 'boolean' ||
typeof arg === 'number' ||
typeof arg === 'string' ||
typeof arg === 'symbol' || // ES6 symbol
typeof arg === 'undefined';
}
exports.isPrimitive = isPrimitive;
function isBuffer(arg) {
return arg instanceof Buffer;
}
exports.isBuffer = isBuffer;
function objectToString(o) {
return Object.prototype.toString.call(o);
}

View File

@@ -0,0 +1,16 @@
The ISC License
Copyright (c) Isaac Z. Schlueter
Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH
REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND
FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,
INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM
LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR
OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
PERFORMANCE OF THIS SOFTWARE.

View File

@@ -0,0 +1,42 @@
Browser-friendly inheritance fully compatible with standard node.js
[inherits](http://nodejs.org/api/util.html#util_util_inherits_constructor_superconstructor).
This package exports the standard `inherits` from the node.js `util` module in a node environment, but also provides an alternative browser-friendly implementation through the [browser field](https://gist.github.com/shtylman/4339901). The alternative implementation is a literal copy of the standard one, kept in a standalone module so that `util` does not need to be required. It also has a shim for old browsers with no `Object.create` support.
While ensuring that you use the standard `inherits` implementation in a node.js environment, this allows bundlers such as [browserify](https://github.com/substack/node-browserify) to leave the full `util` package out of your client code when all you need is the `inherits` function. That is worthwhile because the browser shim for the `util` package is large, and `inherits` is often the only function you need from it.
It is recommended to use this package instead of `require('util').inherits` for any code that may be used not only in node.js but also in the browser.
## usage
```js
var inherits = require('inherits');
// then use exactly as the standard one
```
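A slightly fuller sketch of the same pattern, using hypothetical `Animal`/`Dog` constructors, which also shows the `super_` reference mentioned in the version note below:
```js
var inherits = require('inherits');

function Animal(name) {
  this.name = name;
}
Animal.prototype.speak = function () {
  return this.name + ' makes a noise';
};

function Dog(name) {
  Animal.call(this, name); // invoke the superclass constructor
}
inherits(Dog, Animal); // sets Dog.super_ and wires up the prototype chain

var d = new Dog('Rex');
console.log(d.speak());             // 'Rex makes a noise'
console.log(Dog.super_ === Animal); // true
```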
## note on version ~1.0
Version ~1.0 had a completely different motivation and is compatible
neither with 2.0 nor with the standard node.js `inherits`.
If you are using version ~1.0 and planning to switch to ~2.0, be
careful:
* new version uses `super_` instead of `super` for referencing
superclass
* new version overwrites current prototype while old one preserves any
existing fields on it

View File

@@ -0,0 +1 @@
module.exports = require('util').inherits

View File

@@ -0,0 +1,23 @@
if (typeof Object.create === 'function') {
// implementation from standard node.js 'util' module
module.exports = function inherits(ctor, superCtor) {
ctor.super_ = superCtor
ctor.prototype = Object.create(superCtor.prototype, {
constructor: {
value: ctor,
enumerable: false,
writable: true,
configurable: true
}
});
};
} else {
// old school shim for old browsers
module.exports = function inherits(ctor, superCtor) {
ctor.super_ = superCtor
var TempCtor = function () {}
TempCtor.prototype = superCtor.prototype
ctor.prototype = new TempCtor()
ctor.prototype.constructor = ctor
}
}

View File

@@ -0,0 +1,35 @@
{
"name": "inherits",
"description": "Browser-friendly inheritance fully compatible with standard node.js inherits()",
"version": "2.0.1",
"keywords": [
"inheritance",
"class",
"klass",
"oop",
"object-oriented",
"inherits",
"browser",
"browserify"
],
"main": "./inherits.js",
"browser": "./inherits_browser.js",
"repository": {
"type": "git",
"url": "git://github.com/isaacs/inherits.git"
},
"license": "ISC",
"scripts": {
"test": "node test"
},
"readme": "Browser-friendly inheritance fully compatible with standard node.js\n[inherits](http://nodejs.org/api/util.html#util_util_inherits_constructor_superconstructor).\n\nThis package exports standard `inherits` from node.js `util` module in\nnode environment, but also provides alternative browser-friendly\nimplementation through [browser\nfield](https://gist.github.com/shtylman/4339901). Alternative\nimplementation is a literal copy of standard one located in standalone\nmodule to avoid requiring of `util`. It also has a shim for old\nbrowsers with no `Object.create` support.\n\nWhile keeping you sure you are using standard `inherits`\nimplementation in node.js environment, it allows bundlers such as\n[browserify](https://github.com/substack/node-browserify) to not\ninclude full `util` package to your client code if all you need is\njust `inherits` function. It worth, because browser shim for `util`\npackage is large and `inherits` is often the single function you need\nfrom it.\n\nIt's recommended to use this package instead of\n`require('util').inherits` for any code that has chances to be used\nnot only in node.js but in browser too.\n\n## usage\n\n```js\nvar inherits = require('inherits');\n// then use exactly as the standard one\n```\n\n## note on version ~1.0\n\nVersion ~1.0 had completely different motivation and is not compatible\nneither with 2.0 nor with standard node.js `inherits`.\n\nIf you are using version ~1.0 and planning to switch to ~2.0, be\ncareful:\n\n* new version uses `super_` instead of `super` for referencing\n superclass\n* new version overwrites current prototype while old one preserves any\n existing fields on it\n",
"readmeFilename": "README.md",
"bugs": {
"url": "https://github.com/isaacs/inherits/issues"
},
"homepage": "https://github.com/isaacs/inherits#readme",
"_id": "inherits@2.0.1",
"_shasum": "b17d08d326b4423e568eff719f91b0b1cbdf69f1",
"_resolved": "https://registry.npmjs.org/inherits/-/inherits-2.0.1.tgz",
"_from": "inherits@>=2.0.1 <2.1.0"
}

View File

@@ -0,0 +1,25 @@
var inherits = require('./inherits.js')
var assert = require('assert')
function test(c) {
assert(c.constructor === Child)
assert(c.constructor.super_ === Parent)
assert(Object.getPrototypeOf(c) === Child.prototype)
assert(Object.getPrototypeOf(Object.getPrototypeOf(c)) === Parent.prototype)
assert(c instanceof Child)
assert(c instanceof Parent)
}
function Child() {
Parent.call(this)
test(this)
}
function Parent() {}
inherits(Child, Parent)
var c = new Child
test(c)
console.log('ok')

View File

@@ -0,0 +1,54 @@
# isarray
`Array#isArray` for older browsers.
## Usage
```js
var isArray = require('isarray');
console.log(isArray([])); // => true
console.log(isArray({})); // => false
```
## Installation
With [npm](http://npmjs.org) do
```bash
$ npm install isarray
```
Then bundle for the browser with
[browserify](https://github.com/substack/browserify).
With [component](http://component.io) do
```bash
$ component install juliangruber/isarray
```
## License
(MIT)
Copyright (c) 2013 Julian Gruber &lt;julian@juliangruber.com&gt;
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies
of the Software, and to permit persons to whom the Software is furnished to do
so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

View File

@@ -0,0 +1,209 @@
/**
* Require the given path.
*
* @param {String} path
* @return {Object} exports
* @api public
*/
function require(path, parent, orig) {
var resolved = require.resolve(path);
// lookup failed
if (null == resolved) {
orig = orig || path;
parent = parent || 'root';
var err = new Error('Failed to require "' + orig + '" from "' + parent + '"');
err.path = orig;
err.parent = parent;
err.require = true;
throw err;
}
var module = require.modules[resolved];
// perform real require()
// by invoking the module's
// registered function
if (!module.exports) {
module.exports = {};
module.client = module.component = true;
module.call(this, module.exports, require.relative(resolved), module);
}
return module.exports;
}
/**
* Registered modules.
*/
require.modules = {};
/**
* Registered aliases.
*/
require.aliases = {};
/**
* Resolve `path`.
*
* Lookup:
*
* - PATH/index.js
* - PATH.js
* - PATH
*
* @param {String} path
* @return {String} path or null
* @api private
*/
require.resolve = function(path) {
if (path.charAt(0) === '/') path = path.slice(1);
var index = path + '/index.js';
var paths = [
path,
path + '.js',
path + '.json',
path + '/index.js',
path + '/index.json'
];
for (var i = 0; i < paths.length; i++) {
var path = paths[i];
if (require.modules.hasOwnProperty(path)) return path;
}
if (require.aliases.hasOwnProperty(index)) {
return require.aliases[index];
}
};
/**
* Normalize `path` relative to the current path.
*
* @param {String} curr
* @param {String} path
* @return {String}
* @api private
*/
require.normalize = function(curr, path) {
var segs = [];
if ('.' != path.charAt(0)) return path;
curr = curr.split('/');
path = path.split('/');
for (var i = 0; i < path.length; ++i) {
if ('..' == path[i]) {
curr.pop();
} else if ('.' != path[i] && '' != path[i]) {
segs.push(path[i]);
}
}
return curr.concat(segs).join('/');
};
/**
* Register module at `path` with callback `definition`.
*
* @param {String} path
* @param {Function} definition
* @api private
*/
require.register = function(path, definition) {
require.modules[path] = definition;
};
/**
* Alias a module definition.
*
* @param {String} from
* @param {String} to
* @api private
*/
require.alias = function(from, to) {
if (!require.modules.hasOwnProperty(from)) {
throw new Error('Failed to alias "' + from + '", it does not exist');
}
require.aliases[to] = from;
};
/**
* Return a require function relative to the `parent` path.
*
* @param {String} parent
* @return {Function}
* @api private
*/
require.relative = function(parent) {
var p = require.normalize(parent, '..');
/**
* lastIndexOf helper.
*/
function lastIndexOf(arr, obj) {
var i = arr.length;
while (i--) {
if (arr[i] === obj) return i;
}
return -1;
}
/**
* The relative require() itself.
*/
function localRequire(path) {
var resolved = localRequire.resolve(path);
return require(resolved, parent, path);
}
/**
* Resolve relative to the parent.
*/
localRequire.resolve = function(path) {
var c = path.charAt(0);
if ('/' == c) return path.slice(1);
if ('.' == c) return require.normalize(p, path);
// resolve deps by returning
// the dep in the nearest "deps"
// directory
var segs = parent.split('/');
var i = lastIndexOf(segs, 'deps') + 1;
if (!i) i = 0;
path = segs.slice(0, i + 1).join('/') + '/deps/' + path;
return path;
};
/**
* Check if module is defined at `path`.
*/
localRequire.exists = function(path) {
return require.modules.hasOwnProperty(localRequire.resolve(path));
};
return localRequire;
};
require.register("isarray/index.js", function(exports, require, module){
module.exports = Array.isArray || function (arr) {
return Object.prototype.toString.call(arr) == '[object Array]';
};
});
require.alias("isarray/index.js", "isarray/index.js");

View File

@@ -0,0 +1,19 @@
{
"name" : "isarray",
"description" : "Array#isArray for older browsers",
"version" : "0.0.1",
"repository" : "juliangruber/isarray",
"homepage": "https://github.com/juliangruber/isarray",
"main" : "index.js",
"scripts" : [
"index.js"
],
"dependencies" : {},
"keywords": ["browser","isarray","array"],
"author": {
"name": "Julian Gruber",
"email": "mail@juliangruber.com",
"url": "http://juliangruber.com"
},
"license": "MIT"
}

View File

@@ -0,0 +1,3 @@
module.exports = Array.isArray || function (arr) {
return Object.prototype.toString.call(arr) == '[object Array]';
};

View File

@@ -0,0 +1,38 @@
{
"name": "isarray",
"description": "Array#isArray for older browsers",
"version": "0.0.1",
"repository": {
"type": "git",
"url": "git://github.com/juliangruber/isarray.git"
},
"homepage": "https://github.com/juliangruber/isarray",
"main": "index.js",
"scripts": {
"test": "tap test/*.js"
},
"dependencies": {},
"devDependencies": {
"tap": "*"
},
"keywords": [
"browser",
"isarray",
"array"
],
"author": {
"name": "Julian Gruber",
"email": "mail@juliangruber.com",
"url": "http://juliangruber.com"
},
"license": "MIT",
"readme": "\n# isarray\n\n`Array#isArray` for older browsers.\n\n## Usage\n\n```js\nvar isArray = require('isarray');\n\nconsole.log(isArray([])); // => true\nconsole.log(isArray({})); // => false\n```\n\n## Installation\n\nWith [npm](http://npmjs.org) do\n\n```bash\n$ npm install isarray\n```\n\nThen bundle for the browser with\n[browserify](https://github.com/substack/browserify).\n\nWith [component](http://component.io) do\n\n```bash\n$ component install juliangruber/isarray\n```\n\n## License\n\n(MIT)\n\nCopyright (c) 2013 Julian Gruber &lt;julian@juliangruber.com&gt;\n\nPermission is hereby granted, free of charge, to any person obtaining a copy of\nthis software and associated documentation files (the \"Software\"), to deal in\nthe Software without restriction, including without limitation the rights to\nuse, copy, modify, merge, publish, distribute, sublicense, and/or sell copies\nof the Software, and to permit persons to whom the Software is furnished to do\nso, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n",
"readmeFilename": "README.md",
"bugs": {
"url": "https://github.com/juliangruber/isarray/issues"
},
"_id": "isarray@0.0.1",
"_shasum": "8a18acfca9a8f4177e09abfc6038939b05d1eedf",
"_resolved": "https://registry.npmjs.org/isarray/-/isarray-0.0.1.tgz",
"_from": "isarray@0.0.1"
}

View File

@@ -0,0 +1,20 @@
Copyright Joyent, Inc. and other Node contributors.
Permission is hereby granted, free of charge, to any person obtaining a
copy of this software and associated documentation files (the
"Software"), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to permit
persons to whom the Software is furnished to do so, subject to the
following conditions:
The above copyright notice and this permission notice shall be included
in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
USE OR OTHER DEALINGS IN THE SOFTWARE.

View File

@@ -0,0 +1,7 @@
**string_decoder.js** (`require('string_decoder')`) from Node.js core
Copyright Joyent, Inc. and other Node contributors. See LICENCE file for details.
Version numbers match the versions found in Node core, e.g. 0.10.24 matches Node 0.10.24, likewise 0.11.10 matches Node 0.11.10. **Prefer the stable version over the unstable.**
The *build/* directory contains a build script that will scrape the source from the [joyent/node](https://github.com/joyent/node) repo given a specific Node version.
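As a rough usage sketch, the decoder buffers an incomplete multi-byte character across `write()` calls and only returns it once the remaining bytes arrive (the '€' example is illustrative):
```js
var StringDecoder = require('string_decoder').StringDecoder;
var decoder = new StringDecoder('utf8');

// '€' is the 3-byte UTF-8 sequence 0xE2 0x82 0xAC; split it across two writes.
console.log(JSON.stringify(decoder.write(new Buffer([0xE2, 0x82])))); // "" (buffered)
console.log(JSON.stringify(decoder.write(new Buffer([0xAC]))));       // "€"

// end() flushes anything still buffered (possibly a partial character).
console.log(JSON.stringify(decoder.end())); // ""
```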

View File

@@ -0,0 +1,221 @@
// Copyright Joyent, Inc. and other Node contributors.
//
// Permission is hereby granted, free of charge, to any person obtaining a
// copy of this software and associated documentation files (the
// "Software"), to deal in the Software without restriction, including
// without limitation the rights to use, copy, modify, merge, publish,
// distribute, sublicense, and/or sell copies of the Software, and to permit
// persons to whom the Software is furnished to do so, subject to the
// following conditions:
//
// The above copyright notice and this permission notice shall be included
// in all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
// USE OR OTHER DEALINGS IN THE SOFTWARE.
var Buffer = require('buffer').Buffer;
var isBufferEncoding = Buffer.isEncoding
|| function(encoding) {
switch (encoding && encoding.toLowerCase()) {
case 'hex': case 'utf8': case 'utf-8': case 'ascii': case 'binary': case 'base64': case 'ucs2': case 'ucs-2': case 'utf16le': case 'utf-16le': case 'raw': return true;
default: return false;
}
}
function assertEncoding(encoding) {
if (encoding && !isBufferEncoding(encoding)) {
throw new Error('Unknown encoding: ' + encoding);
}
}
// StringDecoder provides an interface for efficiently splitting a series of
// buffers into a series of JS strings without breaking apart multi-byte
// characters. CESU-8 is handled as part of the UTF-8 encoding.
//
// @TODO Handling all encodings inside a single object makes it very difficult
// to reason about this code, so it should be split up in the future.
// @TODO There should be a utf8-strict encoding that rejects invalid UTF-8 code
// points as used by CESU-8.
var StringDecoder = exports.StringDecoder = function(encoding) {
this.encoding = (encoding || 'utf8').toLowerCase().replace(/[-_]/, '');
assertEncoding(encoding);
switch (this.encoding) {
case 'utf8':
// CESU-8 represents each of Surrogate Pair by 3-bytes
this.surrogateSize = 3;
break;
case 'ucs2':
case 'utf16le':
// UTF-16 represents each of Surrogate Pair by 2-bytes
this.surrogateSize = 2;
this.detectIncompleteChar = utf16DetectIncompleteChar;
break;
case 'base64':
// Base-64 stores 3 bytes in 4 chars, and pads the remainder.
this.surrogateSize = 3;
this.detectIncompleteChar = base64DetectIncompleteChar;
break;
default:
this.write = passThroughWrite;
return;
}
// Enough space to store all bytes of a single character. UTF-8 needs 4
// bytes, but CESU-8 may require up to 6 (3 bytes per surrogate).
this.charBuffer = new Buffer(6);
// Number of bytes received for the current incomplete multi-byte character.
this.charReceived = 0;
// Number of bytes expected for the current incomplete multi-byte character.
this.charLength = 0;
};
// write decodes the given buffer and returns it as JS string that is
// guaranteed to not contain any partial multi-byte characters. Any partial
// character found at the end of the buffer is buffered up, and will be
// returned when calling write again with the remaining bytes.
//
// Note: Converting a Buffer containing an orphan surrogate to a String
// currently works, but converting a String to a Buffer (via `new Buffer`, or
// Buffer#write) will replace incomplete surrogates with the unicode
// replacement character. See https://codereview.chromium.org/121173009/ .
StringDecoder.prototype.write = function(buffer) {
var charStr = '';
// if our last write ended with an incomplete multibyte character
while (this.charLength) {
// determine how many remaining bytes this buffer has to offer for this char
var available = (buffer.length >= this.charLength - this.charReceived) ?
this.charLength - this.charReceived :
buffer.length;
// add the new bytes to the char buffer
buffer.copy(this.charBuffer, this.charReceived, 0, available);
this.charReceived += available;
if (this.charReceived < this.charLength) {
// still not enough chars in this buffer? wait for more ...
return '';
}
// remove bytes belonging to the current character from the buffer
buffer = buffer.slice(available, buffer.length);
// get the character that was split
charStr = this.charBuffer.slice(0, this.charLength).toString(this.encoding);
// CESU-8: lead surrogate (D800-DBFF) is also the incomplete character
var charCode = charStr.charCodeAt(charStr.length - 1);
if (charCode >= 0xD800 && charCode <= 0xDBFF) {
this.charLength += this.surrogateSize;
charStr = '';
continue;
}
this.charReceived = this.charLength = 0;
// if there are no more bytes in this buffer, just emit our char
if (buffer.length === 0) {
return charStr;
}
break;
}
// determine and set charLength / charReceived
this.detectIncompleteChar(buffer);
var end = buffer.length;
if (this.charLength) {
// buffer the incomplete character bytes we got
buffer.copy(this.charBuffer, 0, buffer.length - this.charReceived, end);
end -= this.charReceived;
}
charStr += buffer.toString(this.encoding, 0, end);
var end = charStr.length - 1;
var charCode = charStr.charCodeAt(end);
// CESU-8: lead surrogate (D800-DBFF) is also the incomplete character
if (charCode >= 0xD800 && charCode <= 0xDBFF) {
var size = this.surrogateSize;
this.charLength += size;
this.charReceived += size;
this.charBuffer.copy(this.charBuffer, size, 0, size);
buffer.copy(this.charBuffer, 0, 0, size);
return charStr.substring(0, end);
}
// or just emit the charStr
return charStr;
};
// detectIncompleteChar determines if there is an incomplete UTF-8 character at
// the end of the given buffer. If so, it sets this.charLength to the byte
// length that character, and sets this.charReceived to the number of bytes
// that are available for this character.
StringDecoder.prototype.detectIncompleteChar = function(buffer) {
// determine how many bytes we have to check at the end of this buffer
var i = (buffer.length >= 3) ? 3 : buffer.length;
// Figure out if one of the last i bytes of our buffer announces an
// incomplete char.
for (; i > 0; i--) {
var c = buffer[buffer.length - i];
// See http://en.wikipedia.org/wiki/UTF-8#Description
// 110XXXXX
if (i == 1 && c >> 5 == 0x06) {
this.charLength = 2;
break;
}
// 1110XXXX
if (i <= 2 && c >> 4 == 0x0E) {
this.charLength = 3;
break;
}
// 11110XXX
if (i <= 3 && c >> 3 == 0x1E) {
this.charLength = 4;
break;
}
}
this.charReceived = i;
};
StringDecoder.prototype.end = function(buffer) {
var res = '';
if (buffer && buffer.length)
res = this.write(buffer);
if (this.charReceived) {
var cr = this.charReceived;
var buf = this.charBuffer;
var enc = this.encoding;
res += buf.slice(0, cr).toString(enc);
}
return res;
};
function passThroughWrite(buffer) {
return buffer.toString(this.encoding);
}
function utf16DetectIncompleteChar(buffer) {
this.charReceived = buffer.length % 2;
this.charLength = this.charReceived ? 2 : 0;
}
function base64DetectIncompleteChar(buffer) {
this.charReceived = buffer.length % 3;
this.charLength = this.charReceived ? 3 : 0;
}

View File

@@ -0,0 +1,34 @@
{
"name": "string_decoder",
"version": "0.10.31",
"description": "The string_decoder module from Node core",
"main": "index.js",
"dependencies": {},
"devDependencies": {
"tap": "~0.4.8"
},
"scripts": {
"test": "tap test/simple/*.js"
},
"repository": {
"type": "git",
"url": "git://github.com/rvagg/string_decoder.git"
},
"homepage": "https://github.com/rvagg/string_decoder",
"keywords": [
"string",
"decoder",
"browser",
"browserify"
],
"license": "MIT",
"readme": "**string_decoder.js** (`require('string_decoder')`) from Node.js core\n\nCopyright Joyent, Inc. and other Node contributors. See LICENCE file for details.\n\nVersion numbers match the versions found in Node core, e.g. 0.10.24 matches Node 0.10.24, likewise 0.11.10 matches Node 0.11.10. **Prefer the stable version over the unstable.**\n\nThe *build/* directory contains a build script that will scrape the source from the [joyent/node](https://github.com/joyent/node) repo given a specific Node version.",
"readmeFilename": "README.md",
"bugs": {
"url": "https://github.com/rvagg/string_decoder/issues"
},
"_id": "string_decoder@0.10.31",
"_shasum": "62e203bc41766c6c28c9fc84301dab1c5310fa94",
"_resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-0.10.31.tgz",
"_from": "string_decoder@>=0.10.0 <0.11.0"
}

View File

@@ -0,0 +1,46 @@
{
"name": "readable-stream",
"version": "1.1.13",
"description": "Streams3, a user-land copy of the stream library from Node.js v0.11.x",
"main": "readable.js",
"dependencies": {
"core-util-is": "~1.0.0",
"isarray": "0.0.1",
"string_decoder": "~0.10.x",
"inherits": "~2.0.1"
},
"devDependencies": {
"tap": "~0.2.6"
},
"scripts": {
"test": "tap test/simple/*.js"
},
"repository": {
"type": "git",
"url": "git://github.com/isaacs/readable-stream.git"
},
"keywords": [
"readable",
"stream",
"pipe"
],
"browser": {
"util": false
},
"author": {
"name": "Isaac Z. Schlueter",
"email": "i@izs.me",
"url": "http://blog.izs.me/"
},
"license": "MIT",
"readme": "# readable-stream\n\n***Node-core streams for userland***\n\n[![NPM](https://nodei.co/npm/readable-stream.png?downloads=true&downloadRank=true)](https://nodei.co/npm/readable-stream/)\n[![NPM](https://nodei.co/npm-dl/readable-stream.png&months=6&height=3)](https://nodei.co/npm/readable-stream/)\n\nThis package is a mirror of the Streams2 and Streams3 implementations in Node-core.\n\nIf you want to guarantee a stable streams base, regardless of what version of Node you, or the users of your libraries are using, use **readable-stream** *only* and avoid the *\"stream\"* module in Node-core.\n\n**readable-stream** comes in two major versions, v1.0.x and v1.1.x. The former tracks the Streams2 implementation in Node 0.10, including bug-fixes and minor improvements as they are added. The latter tracks Streams3 as it develops in Node 0.11; we will likely see a v1.2.x branch for Node 0.12.\n\n**readable-stream** uses proper patch-level versioning so if you pin to `\"~1.0.0\"` youll get the latest Node 0.10 Streams2 implementation, including any fixes and minor non-breaking improvements. The patch-level versions of 1.0.x and 1.1.x should mirror the patch-level versions of Node-core releases. You should prefer the **1.0.x** releases for now and when youre ready to start using Streams3, pin to `\"~1.1.0\"`\n\n",
"readmeFilename": "README.md",
"bugs": {
"url": "https://github.com/isaacs/readable-stream/issues"
},
"homepage": "https://github.com/isaacs/readable-stream#readme",
"_id": "readable-stream@1.1.13",
"_shasum": "f6eef764f514c89e2b9e23146a75ba106756d23e",
"_resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-1.1.13.tgz",
"_from": "readable-stream@>=1.1.9 <1.2.0"
}

View File

@@ -0,0 +1 @@
module.exports = require("./lib/_stream_passthrough.js")

View File

@@ -0,0 +1,7 @@
exports = module.exports = require('./lib/_stream_readable.js');
exports.Stream = require('stream');
exports.Readable = exports;
exports.Writable = require('./lib/_stream_writable.js');
exports.Duplex = require('./lib/_stream_duplex.js');
exports.Transform = require('./lib/_stream_transform.js');
exports.PassThrough = require('./lib/_stream_passthrough.js');

View File

@@ -0,0 +1 @@
module.exports = require("./lib/_stream_transform.js")

View File

@@ -0,0 +1 @@
module.exports = require("./lib/_stream_writable.js")

View File

@@ -0,0 +1,42 @@
{
"name": "duplexer2",
"version": "0.0.2",
"description": "Like duplexer (http://npm.im/duplexer) but using streams2",
"main": "index.js",
"scripts": {
"test": "mocha -R tap"
},
"repository": {
"type": "git",
"url": "git://github.com/deoxxa/duplexer2.git"
},
"keywords": [
"duplex",
"stream",
"join",
"combine"
],
"author": {
"name": "Conrad Pankoff",
"email": "deoxxa@fknsrs.biz",
"url": "http://www.fknsrs.biz/"
},
"license": "BSD",
"bugs": {
"url": "https://github.com/deoxxa/duplexer2/issues"
},
"devDependencies": {
"chai": "~1.7.2",
"mocha": "~1.12.1"
},
"dependencies": {
"readable-stream": "~1.1.9"
},
"readme": "duplexer2 [![build status](https://travis-ci.org/deoxxa/duplexer2.png)](https://travis-ci.org/deoxxa/fork)\n=========\n\nLike duplexer (http://npm.im/duplexer) but using streams2.\n\nOverview\n--------\n\nduplexer2 is a reimplementation of [duplexer](http://npm.im/duplexer) using the\nreadable-stream API which is standard in node as of v0.10. Everything largely\nworks the same.\n\nInstallation\n------------\n\nAvailable via [npm](http://npmjs.org/):\n\n> $ npm install duplexer2\n\nOr via git:\n\n> $ git clone git://github.com/deoxxa/duplexer2.git node_modules/duplexer2\n\nAPI\n---\n\n**duplexer2**\n\nCreates a new `DuplexWrapper` object, which is the actual class that implements\nmost of the fun stuff. All that fun stuff is hidden. DON'T LOOK.\n\n```javascript\nduplexer2([options], writable, readable)\n```\n\n```javascript\nvar duplex = duplexer2(new stream.Writable(), new stream.Readable());\n```\n\nArguments\n\n* __options__ - an object specifying the regular `stream.Duplex` options, as\n well as the properties described below.\n* __writable__ - a writable stream\n* __readable__ - a readable stream\n\nOptions\n\n* __bubbleErrors__ - a boolean value that specifies whether to bubble errors\n from the underlying readable/writable streams. Default is `true`.\n\nExample\n-------\n\nAlso see [example.js](https://github.com/deoxxa/duplexer2/blob/master/example.js).\n\nCode:\n\n```javascript\nvar stream = require(\"stream\");\n\nvar duplexer2 = require(\"duplexer2\");\n\nvar writable = new stream.Writable({objectMode: true}),\n readable = new stream.Readable({objectMode: true});\n\nwritable._write = function _write(input, encoding, done) {\n if (readable.push(input)) {\n return done();\n } else {\n readable.once(\"drain\", done);\n }\n};\n\nreadable._read = function _read(n) {\n // no-op\n};\n\n// simulate the readable thing closing after a bit\nwritable.once(\"finish\", function() {\n setTimeout(function() {\n readable.push(null);\n }, 500);\n});\n\nvar duplex = duplexer2(writable, readable);\n\nduplex.on(\"data\", function(e) {\n console.log(\"got data\", JSON.stringify(e));\n});\n\nduplex.on(\"finish\", function() {\n console.log(\"got finish event\");\n});\n\nduplex.on(\"end\", function() {\n console.log(\"got end event\");\n});\n\nduplex.write(\"oh, hi there\", function() {\n console.log(\"finished writing\");\n});\n\nduplex.end(function() {\n console.log(\"finished ending\");\n});\n```\n\nOutput:\n\n```\ngot data \"oh, hi there\"\nfinished writing\ngot finish event\nfinished ending\ngot end event\n```\n\nLicense\n-------\n\n3-clause BSD. A copy is included with the source.\n\nContact\n-------\n\n* GitHub ([deoxxa](http://github.com/deoxxa))\n* Twitter ([@deoxxa](http://twitter.com/deoxxa))\n* Email ([deoxxa@fknsrs.biz](mailto:deoxxa@fknsrs.biz))\n",
"readmeFilename": "README.md",
"homepage": "https://github.com/deoxxa/duplexer2#readme",
"_id": "duplexer2@0.0.2",
"_shasum": "c614dcf67e2fb14995a91711e5a617e8a60a31db",
"_resolved": "https://registry.npmjs.org/duplexer2/-/duplexer2-0.0.2.tgz",
"_from": "duplexer2@>=0.0.2 <0.1.0"
}

View File

@@ -0,0 +1,161 @@
var assert = require("chai").assert;
var stream = require("readable-stream");
var duplexer2 = require("../");
describe("duplexer2", function() {
var writable, readable;
beforeEach(function() {
writable = new stream.Writable({objectMode: true});
readable = new stream.Readable({objectMode: true});
writable._write = function _write(input, encoding, done) {
return done();
};
readable._read = function _read(n) {
};
});
it("should interact with the writable stream properly for writing", function(done) {
var duplex = duplexer2(writable, readable);
writable._write = function _write(input, encoding, _done) {
assert.strictEqual(input, "well hello there");
return done();
};
duplex.write("well hello there");
});
it("should interact with the readable stream properly for reading", function(done) {
var duplex = duplexer2(writable, readable);
duplex.on("data", function(e) {
assert.strictEqual(e, "well hello there");
return done();
});
readable.push("well hello there");
});
it("should end the writable stream, causing it to finish", function(done) {
var duplex = duplexer2(writable, readable);
writable.once("finish", done);
duplex.end();
});
it("should finish when the writable stream finishes", function(done) {
var duplex = duplexer2(writable, readable);
duplex.once("finish", done);
writable.end();
});
it("should end when the readable stream ends", function(done) {
var duplex = duplexer2(writable, readable);
// required to let "end" fire without reading
duplex.resume();
duplex.once("end", done);
readable.push(null);
});
it("should bubble errors from the writable stream when no behaviour is specified", function(done) {
var duplex = duplexer2(writable, readable);
var originalErr = Error("testing");
duplex.on("error", function(err) {
assert.strictEqual(err, originalErr);
return done();
});
writable.emit("error", originalErr);
});
it("should bubble errors from the readable stream when no behaviour is specified", function(done) {
var duplex = duplexer2(writable, readable);
var originalErr = Error("testing");
duplex.on("error", function(err) {
assert.strictEqual(err, originalErr);
return done();
});
readable.emit("error", originalErr);
});
it("should bubble errors from the writable stream when bubbleErrors is true", function(done) {
var duplex = duplexer2({bubbleErrors: true}, writable, readable);
var originalErr = Error("testing");
duplex.on("error", function(err) {
assert.strictEqual(err, originalErr);
return done();
});
writable.emit("error", originalErr);
});
it("should bubble errors from the readable stream when bubbleErrors is true", function(done) {
var duplex = duplexer2({bubbleErrors: true}, writable, readable);
var originalErr = Error("testing");
duplex.on("error", function(err) {
assert.strictEqual(err, originalErr);
return done();
});
readable.emit("error", originalErr);
});
it("should not bubble errors from the writable stream when bubbleErrors is false", function(done) {
var duplex = duplexer2({bubbleErrors: false}, writable, readable);
var timeout = setTimeout(done, 25);
duplex.on("error", function(err) {
clearTimeout(timeout);
return done(Error("shouldn't bubble error"));
});
// prevent uncaught error exception
writable.on("error", function() {});
writable.emit("error", Error("testing"));
});
it("should not bubble errors from the readable stream when bubbleErrors is false", function(done) {
var duplex = duplexer2({bubbleErrors: false}, writable, readable);
var timeout = setTimeout(done, 25);
duplex.on("error", function(err) {
clearTimeout(timeout);
return done(Error("shouldn't bubble error"));
});
// prevent uncaught error exception
readable.on("error", function() {});
readable.emit("error", Error("testing"));
});
});

View File

@@ -0,0 +1,3 @@
test
.jshintrc
.travis.yml

View File

@@ -0,0 +1,39 @@
Copyright 2013, Rod Vagg (the "Original Author")
All rights reserved.
MIT +no-false-attribs License
Permission is hereby granted, free of charge, to any person
obtaining a copy of this software and associated documentation
files (the "Software"), to deal in the Software without
restriction, including without limitation the rights to use,
copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the
Software is furnished to do so, subject to the following
conditions:
The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.
Distributions of all or part of the Software intended to be used
by the recipients as they would use the unmodified Software,
containing modifications that substantially alter, remove, or
disable functionality of the Software, outside of the documented
configuration mechanisms provided by the Software, shall be
modified such that the Original Author's bug reporting email
addresses and urls are either replaced with the contact information
of the parties responsible for the changes, or removed entirely.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
OTHER DEALINGS IN THE SOFTWARE.
Except where noted, this license applies to any and all software
programs and associated documentation files created by the
Original Author, when distributed with the Software.

View File

@@ -0,0 +1,132 @@
# through2
[![NPM](https://nodei.co/npm/through2.png?downloads&downloadRank)](https://nodei.co/npm/through2/)
**A tiny wrapper around Node streams.Transform (Streams2) to avoid explicit subclassing noise**
Inspired by [Dominic Tarr](https://github.com/dominictarr)'s [through](https://github.com/dominictarr/through) in that it's so much easier to make a stream out of a function than it is to set up the prototype chain properly: `through(function (chunk) { ... })`.
Note: A **Streams3** version of through2 is available in npm with the tag `"1.0"` rather than `"latest"`, so an `npm install through2` will get you the current Streams2 version (version number is 0.x.x). To use a Streams3 version, use `npm install through2@1` to fetch the latest version 1.x.x. For more information about Streams2 vs Streams3 and recommendations, see the article **[Why I don't use Node's core 'stream' module](http://r.va.gg/2014/06/why-i-dont-use-nodes-core-stream-module.html)**.
```js
fs.createReadStream('ex.txt')
.pipe(through2(function (chunk, enc, callback) {
for (var i = 0; i < chunk.length; i++)
if (chunk[i] == 97)
chunk[i] = 122 // swap 'a' for 'z'
this.push(chunk)
callback()
}))
.pipe(fs.createWriteStream('out.txt'))
```
Or object streams:
```js
var all = []
fs.createReadStream('data.csv')
.pipe(csv2())
.pipe(through2.obj(function (chunk, enc, callback) {
var data = {
name : chunk[0]
, address : chunk[3]
, phone : chunk[10]
}
this.push(data)
callback()
}))
.on('data', function (data) {
all.push(data)
})
.on('end', function () {
doSomethingSpecial(all)
})
```
Note that `through2.obj(fn)` is a convenience wrapper around `through2({ objectMode: true }, fn)`.
## API
<b><code>through2([ options, ] [ transformFunction ] [, flushFunction ])</code></b>
Consult the **[stream.Transform](http://nodejs.org/docs/latest/api/stream.html#stream_class_stream_transform)** documentation for the exact rules of the `transformFunction` (i.e. `this._transform`) and the optional `flushFunction` (i.e. `this._flush`).
### options
The options argument is optional and is passed straight through to `stream.Transform`. So you can use `objectMode:true` if you are processing non-binary streams (or just use `through2.obj()`).
The `options` argument is first, unlike standard convention, because if I'm passing in an anonymous function then I'd prefer for the options argument to not get lost at the end of the call:
```js
fs.createReadStream('/tmp/important.dat')
.pipe(through2({ objectMode: true, allowHalfOpen: false },
function (chunk, enc, cb) {
cb(null, 'wut?') // note we can use the second argument on the callback
// to provide data as an alternative to this.push('wut?')
}
))
.pipe(fs.createWriteStream('/tmp/wut.txt'))
```
### transformFunction
The `transformFunction` must have the following signature: `function (chunk, encoding, callback) {}`. A minimal implementation should call the `callback` function to indicate that the transformation is done, even if that transformation means discarding the chunk.
To queue a new chunk, call `this.push(chunk)`&mdash;this can be called as many times as required before the `callback()` if you have multiple pieces to send on.
Alternatively, you may use `callback(err, chunk)` as shorthand for emitting a single chunk or an error.
If you **do not provide a `transformFunction`** then you will get a simple pass-through stream.
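For illustration, here is a minimal sketch (not taken from the through2 docs; the `upperCaser` name is made up) that relies on the `callback(err, chunk)` shorthand instead of `this.push()`:
```js
var through2 = require('through2')
var upperCaser = through2(function (chunk, enc, callback) {
  // no explicit this.push() - hand the transformed chunk to the callback instead
  callback(null, chunk.toString().toUpperCase())
})
process.stdin.pipe(upperCaser).pipe(process.stdout)
```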
### flushFunction
The optional `flushFunction`, provided as the last argument (2nd or 3rd, depending on whether you've supplied options), is called just prior to the stream ending. It can be used to finish up any processing that may be in progress.
```js
fs.createReadStream('/tmp/important.dat')
.pipe(through2(
function (chunk, enc, cb) { cb(null, chunk) }, // transform is a noop
function (cb) { // flush function
this.push('tacking on an extra buffer to the end');
cb();
}
))
.pipe(fs.createWriteStream('/tmp/wut.txt'));
```
<b><code>through2.ctor([ options, ] transformFunction[, flushFunction ])</code></b>
Instead of returning a `stream.Transform` instance, `through2.ctor()` returns a **constructor** for a custom Transform. This is useful when you want to use the same transform logic in multiple instances.
```js
var FToC = through2.ctor({objectMode: true}, function (record, encoding, callback) {
if (record.temp != null && record.unit == "F") {
record.temp = ( ( record.temp - 32 ) * 5 ) / 9
record.unit = "C"
}
this.push(record)
callback()
})
// Create instances of FToC like so:
var converter = new FToC()
// Or:
var converter = FToC()
// Or specify/override options when you instantiate, if you prefer:
var converter = FToC({objectMode: true})
```
## See Also
- [through2-map](https://github.com/brycebaril/through2-map) - Array.prototype.map analog for streams.
- [through2-filter](https://github.com/brycebaril/through2-filter) - Array.prototype.filter analog for streams.
- [through2-reduce](https://github.com/brycebaril/through2-reduce) - Array.prototype.reduce analog for streams.
- [through2-spy](https://github.com/brycebaril/through2-spy) - Wrapper for simple stream.PassThrough spies.
## License
**through2** is Copyright (c) 2013 Rod Vagg [@rvagg](https://twitter.com/rvagg) and licenced under the MIT licence. All rights not explicitly granted in the MIT license are reserved. See the included LICENSE file for more details.

View File

@@ -0,0 +1,5 @@
build/
test/
examples/
fs.js
zlib.js

View File

@@ -0,0 +1,18 @@
Copyright Joyent, Inc. and other Node contributors. All rights reserved.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to
deal in the Software without restriction, including without limitation the
rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
sell copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
IN THE SOFTWARE.

View File

@@ -0,0 +1,15 @@
# readable-stream
***Node-core streams for userland***
[![NPM](https://nodei.co/npm/readable-stream.png?downloads=true&downloadRank=true)](https://nodei.co/npm/readable-stream/)
[![NPM](https://nodei.co/npm-dl/readable-stream.png?&months=6&height=3)](https://nodei.co/npm/readable-stream/)
This package is a mirror of the Streams2 and Streams3 implementations in Node-core.
If you want to guarantee a stable streams base, regardless of what version of Node you, or the users of your libraries, are using, use **readable-stream** *only* and avoid the *"stream"* module in Node-core.
**readable-stream** comes in two major versions, v1.0.x and v1.1.x. The former tracks the Streams2 implementation in Node 0.10, including bug-fixes and minor improvements as they are added. The latter tracks Streams3 as it develops in Node 0.11; we will likely see a v1.2.x branch for Node 0.12.
**readable-stream** uses proper patch-level versioning, so if you pin to `"~1.0.0"` you'll get the latest Node 0.10 Streams2 implementation, including any fixes and minor non-breaking improvements. The patch-level versions of 1.0.x and 1.1.x should mirror the patch-level versions of Node-core releases. You should prefer the **1.0.x** releases for now, and when you're ready to start using Streams3, pin to `"~1.1.0"`.
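As a rough sketch of what pinning looks like in practice (the dependency line in the comment is an assumption about your `package.json`, not part of this README):
```js
// package.json (assumed): "dependencies": { "readable-stream": "~1.0.0" }
var Readable = require('readable-stream').Readable
var rs = new Readable()
rs._read = function () {} // no-op: all data is pushed up-front
rs.push('beep ')
rs.push('boop\n')
rs.push(null) // signal EOF
rs.pipe(process.stdout) // same Streams2 behaviour regardless of the Node version in use
```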

View File

@@ -0,0 +1 @@
module.exports = require("./lib/_stream_duplex.js")

View File

@@ -0,0 +1,89 @@
// Copyright Joyent, Inc. and other Node contributors.
//
// Permission is hereby granted, free of charge, to any person obtaining a
// copy of this software and associated documentation files (the
// "Software"), to deal in the Software without restriction, including
// without limitation the rights to use, copy, modify, merge, publish,
// distribute, sublicense, and/or sell copies of the Software, and to permit
// persons to whom the Software is furnished to do so, subject to the
// following conditions:
//
// The above copyright notice and this permission notice shall be included
// in all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
// USE OR OTHER DEALINGS IN THE SOFTWARE.
// a duplex stream is just a stream that is both readable and writable.
// Since JS doesn't have multiple prototypal inheritance, this class
// prototypally inherits from Readable, and then parasitically from
// Writable.
module.exports = Duplex;
/*<replacement>*/
var objectKeys = Object.keys || function (obj) {
var keys = [];
for (var key in obj) keys.push(key);
return keys;
}
/*</replacement>*/
/*<replacement>*/
var util = require('core-util-is');
util.inherits = require('inherits');
/*</replacement>*/
var Readable = require('./_stream_readable');
var Writable = require('./_stream_writable');
util.inherits(Duplex, Readable);
forEach(objectKeys(Writable.prototype), function(method) {
if (!Duplex.prototype[method])
Duplex.prototype[method] = Writable.prototype[method];
});
function Duplex(options) {
if (!(this instanceof Duplex))
return new Duplex(options);
Readable.call(this, options);
Writable.call(this, options);
if (options && options.readable === false)
this.readable = false;
if (options && options.writable === false)
this.writable = false;
this.allowHalfOpen = true;
if (options && options.allowHalfOpen === false)
this.allowHalfOpen = false;
this.once('end', onend);
}
// the no-half-open enforcer
function onend() {
// if we allow half-open state, or if the writable side ended,
// then we're ok.
if (this.allowHalfOpen || this._writableState.ended)
return;
// no more data can be written.
// But allow more writes to happen in this tick.
process.nextTick(this.end.bind(this));
}
function forEach (xs, f) {
for (var i = 0, l = xs.length; i < l; i++) {
f(xs[i], i);
}
}

View File

@@ -0,0 +1,46 @@
// Copyright Joyent, Inc. and other Node contributors.
//
// Permission is hereby granted, free of charge, to any person obtaining a
// copy of this software and associated documentation files (the
// "Software"), to deal in the Software without restriction, including
// without limitation the rights to use, copy, modify, merge, publish,
// distribute, sublicense, and/or sell copies of the Software, and to permit
// persons to whom the Software is furnished to do so, subject to the
// following conditions:
//
// The above copyright notice and this permission notice shall be included
// in all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
// USE OR OTHER DEALINGS IN THE SOFTWARE.
// a passthrough stream.
// basically just the most minimal sort of Transform stream.
// Every written chunk gets output as-is.
module.exports = PassThrough;
var Transform = require('./_stream_transform');
/*<replacement>*/
var util = require('core-util-is');
util.inherits = require('inherits');
/*</replacement>*/
util.inherits(PassThrough, Transform);
function PassThrough(options) {
if (!(this instanceof PassThrough))
return new PassThrough(options);
Transform.call(this, options);
}
PassThrough.prototype._transform = function(chunk, encoding, cb) {
cb(null, chunk);
};

View File

@@ -0,0 +1,982 @@
// Copyright Joyent, Inc. and other Node contributors.
//
// Permission is hereby granted, free of charge, to any person obtaining a
// copy of this software and associated documentation files (the
// "Software"), to deal in the Software without restriction, including
// without limitation the rights to use, copy, modify, merge, publish,
// distribute, sublicense, and/or sell copies of the Software, and to permit
// persons to whom the Software is furnished to do so, subject to the
// following conditions:
//
// The above copyright notice and this permission notice shall be included
// in all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
// USE OR OTHER DEALINGS IN THE SOFTWARE.
module.exports = Readable;
/*<replacement>*/
var isArray = require('isarray');
/*</replacement>*/
/*<replacement>*/
var Buffer = require('buffer').Buffer;
/*</replacement>*/
Readable.ReadableState = ReadableState;
var EE = require('events').EventEmitter;
/*<replacement>*/
if (!EE.listenerCount) EE.listenerCount = function(emitter, type) {
return emitter.listeners(type).length;
};
/*</replacement>*/
var Stream = require('stream');
/*<replacement>*/
var util = require('core-util-is');
util.inherits = require('inherits');
/*</replacement>*/
var StringDecoder;
util.inherits(Readable, Stream);
function ReadableState(options, stream) {
options = options || {};
// the point at which it stops calling _read() to fill the buffer
// Note: 0 is a valid value, means "don't call _read preemptively ever"
var hwm = options.highWaterMark;
this.highWaterMark = (hwm || hwm === 0) ? hwm : 16 * 1024;
// cast to ints.
this.highWaterMark = ~~this.highWaterMark;
this.buffer = [];
this.length = 0;
this.pipes = null;
this.pipesCount = 0;
this.flowing = false;
this.ended = false;
this.endEmitted = false;
this.reading = false;
// In streams that never have any data, and do push(null) right away,
// the consumer can miss the 'end' event if they do some I/O before
// consuming the stream. So, we don't emit('end') until some reading
// happens.
this.calledRead = false;
// a flag to be able to tell if the onwrite cb is called immediately,
// or on a later tick. We set this to true at first, because any
// actions that shouldn't happen until "later" should generally also
// not happen before the first write call.
this.sync = true;
// whenever we return null, then we set a flag to say
// that we're awaiting a 'readable' event emission.
this.needReadable = false;
this.emittedReadable = false;
this.readableListening = false;
// object stream flag. Used to make read(n) ignore n and to
// make all the buffer merging and length checks go away
this.objectMode = !!options.objectMode;
// Crypto is kind of old and crusty. Historically, its default string
// encoding is 'binary' so we have to make this configurable.
// Everything else in the universe uses 'utf8', though.
this.defaultEncoding = options.defaultEncoding || 'utf8';
// when piping, we only care about 'readable' events that happen
// after read()ing all the bytes and not getting any pushback.
this.ranOut = false;
// the number of writers that are awaiting a drain event in .pipe()s
this.awaitDrain = 0;
// if true, a maybeReadMore has been scheduled
this.readingMore = false;
this.decoder = null;
this.encoding = null;
if (options.encoding) {
if (!StringDecoder)
StringDecoder = require('string_decoder/').StringDecoder;
this.decoder = new StringDecoder(options.encoding);
this.encoding = options.encoding;
}
}
function Readable(options) {
if (!(this instanceof Readable))
return new Readable(options);
this._readableState = new ReadableState(options, this);
// legacy
this.readable = true;
Stream.call(this);
}
// Manually shove something into the read() buffer.
// This returns true if the highWaterMark has not been hit yet,
// similar to how Writable.write() returns true if you should
// write() some more.
Readable.prototype.push = function(chunk, encoding) {
var state = this._readableState;
if (typeof chunk === 'string' && !state.objectMode) {
encoding = encoding || state.defaultEncoding;
if (encoding !== state.encoding) {
chunk = new Buffer(chunk, encoding);
encoding = '';
}
}
return readableAddChunk(this, state, chunk, encoding, false);
};
// Unshift should *always* be something directly out of read()
Readable.prototype.unshift = function(chunk) {
var state = this._readableState;
return readableAddChunk(this, state, chunk, '', true);
};
function readableAddChunk(stream, state, chunk, encoding, addToFront) {
var er = chunkInvalid(state, chunk);
if (er) {
stream.emit('error', er);
} else if (chunk === null || chunk === undefined) {
state.reading = false;
if (!state.ended)
onEofChunk(stream, state);
} else if (state.objectMode || chunk && chunk.length > 0) {
if (state.ended && !addToFront) {
var e = new Error('stream.push() after EOF');
stream.emit('error', e);
} else if (state.endEmitted && addToFront) {
var e = new Error('stream.unshift() after end event');
stream.emit('error', e);
} else {
if (state.decoder && !addToFront && !encoding)
chunk = state.decoder.write(chunk);
// update the buffer info.
state.length += state.objectMode ? 1 : chunk.length;
if (addToFront) {
state.buffer.unshift(chunk);
} else {
state.reading = false;
state.buffer.push(chunk);
}
if (state.needReadable)
emitReadable(stream);
maybeReadMore(stream, state);
}
} else if (!addToFront) {
state.reading = false;
}
return needMoreData(state);
}
// if it's past the high water mark, we can push in some more.
// Also, if we have no data yet, we can stand some
// more bytes. This is to work around cases where hwm=0,
// such as the repl. Also, if the push() triggered a
// readable event, and the user called read(largeNumber) such that
// needReadable was set, then we ought to push more, so that another
// 'readable' event will be triggered.
function needMoreData(state) {
return !state.ended &&
(state.needReadable ||
state.length < state.highWaterMark ||
state.length === 0);
}
// backwards compatibility.
Readable.prototype.setEncoding = function(enc) {
if (!StringDecoder)
StringDecoder = require('string_decoder/').StringDecoder;
this._readableState.decoder = new StringDecoder(enc);
this._readableState.encoding = enc;
};
// Don't raise the hwm > 128MB
var MAX_HWM = 0x800000;
function roundUpToNextPowerOf2(n) {
if (n >= MAX_HWM) {
n = MAX_HWM;
} else {
// Get the next highest power of 2
n--;
for (var p = 1; p < 32; p <<= 1) n |= n >> p;
n++;
}
return n;
}
function howMuchToRead(n, state) {
if (state.length === 0 && state.ended)
return 0;
if (state.objectMode)
return n === 0 ? 0 : 1;
if (n === null || isNaN(n)) {
// only flow one buffer at a time
if (state.flowing && state.buffer.length)
return state.buffer[0].length;
else
return state.length;
}
if (n <= 0)
return 0;
// If we're asking for more than the target buffer level,
// then raise the water mark. Bump up to the next highest
// power of 2, to prevent increasing it excessively in tiny
// amounts.
if (n > state.highWaterMark)
state.highWaterMark = roundUpToNextPowerOf2(n);
// don't have that much. return null, unless we've ended.
if (n > state.length) {
if (!state.ended) {
state.needReadable = true;
return 0;
} else
return state.length;
}
return n;
}
// you can override either this method, or the async _read(n) below.
Readable.prototype.read = function(n) {
var state = this._readableState;
state.calledRead = true;
var nOrig = n;
var ret;
if (typeof n !== 'number' || n > 0)
state.emittedReadable = false;
// if we're doing read(0) to trigger a readable event, but we
// already have a bunch of data in the buffer, then just trigger
// the 'readable' event and move on.
if (n === 0 &&
state.needReadable &&
(state.length >= state.highWaterMark || state.ended)) {
emitReadable(this);
return null;
}
n = howMuchToRead(n, state);
// if we've ended, and we're now clear, then finish it up.
if (n === 0 && state.ended) {
ret = null;
// In cases where the decoder did not receive enough data
// to produce a full chunk, then immediately received an
// EOF, state.buffer will contain [<Buffer >, <Buffer 00 ...>].
// howMuchToRead will see this and coerce the amount to
// read to zero (because it's looking at the length of the
// first <Buffer > in state.buffer), and we'll end up here.
//
// This can only happen via state.decoder -- no other venue
// exists for pushing a zero-length chunk into state.buffer
// and triggering this behavior. In this case, we return our
// remaining data and end the stream, if appropriate.
if (state.length > 0 && state.decoder) {
ret = fromList(n, state);
state.length -= ret.length;
}
if (state.length === 0)
endReadable(this);
return ret;
}
// All the actual chunk generation logic needs to be
// *below* the call to _read. The reason is that in certain
// synthetic stream cases, such as passthrough streams, _read
// may be a completely synchronous operation which may change
// the state of the read buffer, providing enough data when
// before there was *not* enough.
//
// So, the steps are:
// 1. Figure out what the state of things will be after we do
// a read from the buffer.
//
// 2. If that resulting state will trigger a _read, then call _read.
// Note that this may be asynchronous, or synchronous. Yes, it is
// deeply ugly to write APIs this way, but that still doesn't mean
// that the Readable class should behave improperly, as streams are
// designed to be sync/async agnostic.
// Take note if the _read call is sync or async (ie, if the read call
// has returned yet), so that we know whether or not it's safe to emit
// 'readable' etc.
//
// 3. Actually pull the requested chunks out of the buffer and return.
// if we need a readable event, then we need to do some reading.
var doRead = state.needReadable;
// if we currently have less than the highWaterMark, then also read some
if (state.length - n <= state.highWaterMark)
doRead = true;
// however, if we've ended, then there's no point, and if we're already
// reading, then it's unnecessary.
if (state.ended || state.reading)
doRead = false;
if (doRead) {
state.reading = true;
state.sync = true;
// if the length is currently zero, then we *need* a readable event.
if (state.length === 0)
state.needReadable = true;
// call internal read method
this._read(state.highWaterMark);
state.sync = false;
}
// If _read called its callback synchronously, then `reading`
// will be false, and we need to re-evaluate how much data we
// can return to the user.
if (doRead && !state.reading)
n = howMuchToRead(nOrig, state);
if (n > 0)
ret = fromList(n, state);
else
ret = null;
if (ret === null) {
state.needReadable = true;
n = 0;
}
state.length -= n;
// If we have nothing in the buffer, then we want to know
// as soon as we *do* get something into the buffer.
if (state.length === 0 && !state.ended)
state.needReadable = true;
// If we happened to read() exactly the remaining amount in the
// buffer, and the EOF has been seen at this point, then make sure
// that we emit 'end' on the very next tick.
if (state.ended && !state.endEmitted && state.length === 0)
endReadable(this);
return ret;
};
function chunkInvalid(state, chunk) {
var er = null;
if (!Buffer.isBuffer(chunk) &&
'string' !== typeof chunk &&
chunk !== null &&
chunk !== undefined &&
!state.objectMode) {
er = new TypeError('Invalid non-string/buffer chunk');
}
return er;
}
function onEofChunk(stream, state) {
if (state.decoder && !state.ended) {
var chunk = state.decoder.end();
if (chunk && chunk.length) {
state.buffer.push(chunk);
state.length += state.objectMode ? 1 : chunk.length;
}
}
state.ended = true;
// if we've ended and we have some data left, then emit
// 'readable' now to make sure it gets picked up.
if (state.length > 0)
emitReadable(stream);
else
endReadable(stream);
}
// Don't emit readable right away in sync mode, because this can trigger
// another read() call => stack overflow. This way, it might trigger
// a nextTick recursion warning, but that's not so bad.
function emitReadable(stream) {
var state = stream._readableState;
state.needReadable = false;
if (state.emittedReadable)
return;
state.emittedReadable = true;
if (state.sync)
process.nextTick(function() {
emitReadable_(stream);
});
else
emitReadable_(stream);
}
function emitReadable_(stream) {
stream.emit('readable');
}
// at this point, the user has presumably seen the 'readable' event,
// and called read() to consume some data. that may have triggered
// in turn another _read(n) call, in which case reading = true if
// it's in progress.
// However, if we're not ended, or reading, and the length < hwm,
// then go ahead and try to read some more preemptively.
function maybeReadMore(stream, state) {
if (!state.readingMore) {
state.readingMore = true;
process.nextTick(function() {
maybeReadMore_(stream, state);
});
}
}
function maybeReadMore_(stream, state) {
var len = state.length;
while (!state.reading && !state.flowing && !state.ended &&
state.length < state.highWaterMark) {
stream.read(0);
if (len === state.length)
// didn't get any data, stop spinning.
break;
else
len = state.length;
}
state.readingMore = false;
}
// abstract method. to be overridden in specific implementation classes.
// call cb(er, data) where data is <= n in length.
// for virtual (non-string, non-buffer) streams, "length" is somewhat
// arbitrary, and perhaps not very meaningful.
Readable.prototype._read = function(n) {
this.emit('error', new Error('not implemented'));
};
Readable.prototype.pipe = function(dest, pipeOpts) {
var src = this;
var state = this._readableState;
switch (state.pipesCount) {
case 0:
state.pipes = dest;
break;
case 1:
state.pipes = [state.pipes, dest];
break;
default:
state.pipes.push(dest);
break;
}
state.pipesCount += 1;
var doEnd = (!pipeOpts || pipeOpts.end !== false) &&
dest !== process.stdout &&
dest !== process.stderr;
var endFn = doEnd ? onend : cleanup;
if (state.endEmitted)
process.nextTick(endFn);
else
src.once('end', endFn);
dest.on('unpipe', onunpipe);
function onunpipe(readable) {
if (readable !== src) return;
cleanup();
}
function onend() {
dest.end();
}
// when the dest drains, it reduces the awaitDrain counter
// on the source. This would be more elegant with a .once()
// handler in flow(), but adding and removing repeatedly is
// too slow.
var ondrain = pipeOnDrain(src);
dest.on('drain', ondrain);
function cleanup() {
// cleanup event handlers once the pipe is broken
dest.removeListener('close', onclose);
dest.removeListener('finish', onfinish);
dest.removeListener('drain', ondrain);
dest.removeListener('error', onerror);
dest.removeListener('unpipe', onunpipe);
src.removeListener('end', onend);
src.removeListener('end', cleanup);
// if the reader is waiting for a drain event from this
// specific writer, then it would cause it to never start
// flowing again.
// So, if this is awaiting a drain, then we just call it now.
// If we don't know, then assume that we are waiting for one.
if (!dest._writableState || dest._writableState.needDrain)
ondrain();
}
// if the dest has an error, then stop piping into it.
// however, don't suppress the throwing behavior for this.
function onerror(er) {
unpipe();
dest.removeListener('error', onerror);
if (EE.listenerCount(dest, 'error') === 0)
dest.emit('error', er);
}
// This is a brutally ugly hack to make sure that our error handler
// is attached before any userland ones. NEVER DO THIS.
if (!dest._events || !dest._events.error)
dest.on('error', onerror);
else if (isArray(dest._events.error))
dest._events.error.unshift(onerror);
else
dest._events.error = [onerror, dest._events.error];
// Both close and finish should trigger unpipe, but only once.
function onclose() {
dest.removeListener('finish', onfinish);
unpipe();
}
dest.once('close', onclose);
function onfinish() {
dest.removeListener('close', onclose);
unpipe();
}
dest.once('finish', onfinish);
function unpipe() {
src.unpipe(dest);
}
// tell the dest that it's being piped to
dest.emit('pipe', src);
// start the flow if it hasn't been started already.
if (!state.flowing) {
// the handler that waits for readable events after all
// the data gets sucked out in flow.
// This would be easier to follow with a .once() handler
// in flow(), but that is too slow.
this.on('readable', pipeOnReadable);
state.flowing = true;
process.nextTick(function() {
flow(src);
});
}
return dest;
};
function pipeOnDrain(src) {
return function() {
var dest = this;
var state = src._readableState;
state.awaitDrain--;
if (state.awaitDrain === 0)
flow(src);
};
}
function flow(src) {
var state = src._readableState;
var chunk;
state.awaitDrain = 0;
function write(dest, i, list) {
var written = dest.write(chunk);
if (false === written) {
state.awaitDrain++;
}
}
while (state.pipesCount && null !== (chunk = src.read())) {
if (state.pipesCount === 1)
write(state.pipes, 0, null);
else
forEach(state.pipes, write);
src.emit('data', chunk);
// if anyone needs a drain, then we have to wait for that.
if (state.awaitDrain > 0)
return;
}
// if every destination was unpiped, either before entering this
// function, or in the while loop, then stop flowing.
//
// NB: This is a pretty rare edge case.
if (state.pipesCount === 0) {
state.flowing = false;
// if there were data event listeners added, then switch to old mode.
if (EE.listenerCount(src, 'data') > 0)
emitDataEvents(src);
return;
}
// at this point, no one needed a drain, so we just ran out of data
// on the next readable event, start it over again.
state.ranOut = true;
}
function pipeOnReadable() {
if (this._readableState.ranOut) {
this._readableState.ranOut = false;
flow(this);
}
}
Readable.prototype.unpipe = function(dest) {
var state = this._readableState;
// if we're not piping anywhere, then do nothing.
if (state.pipesCount === 0)
return this;
// just one destination. most common case.
if (state.pipesCount === 1) {
// passed in one, but it's not the right one.
if (dest && dest !== state.pipes)
return this;
if (!dest)
dest = state.pipes;
// got a match.
state.pipes = null;
state.pipesCount = 0;
this.removeListener('readable', pipeOnReadable);
state.flowing = false;
if (dest)
dest.emit('unpipe', this);
return this;
}
// slow case. multiple pipe destinations.
if (!dest) {
// remove all.
var dests = state.pipes;
var len = state.pipesCount;
state.pipes = null;
state.pipesCount = 0;
this.removeListener('readable', pipeOnReadable);
state.flowing = false;
for (var i = 0; i < len; i++)
dests[i].emit('unpipe', this);
return this;
}
// try to find the right one.
var i = indexOf(state.pipes, dest);
if (i === -1)
return this;
state.pipes.splice(i, 1);
state.pipesCount -= 1;
if (state.pipesCount === 1)
state.pipes = state.pipes[0];
dest.emit('unpipe', this);
return this;
};
// set up data events if they are asked for
// Ensure readable listeners eventually get something
Readable.prototype.on = function(ev, fn) {
var res = Stream.prototype.on.call(this, ev, fn);
if (ev === 'data' && !this._readableState.flowing)
emitDataEvents(this);
if (ev === 'readable' && this.readable) {
var state = this._readableState;
if (!state.readableListening) {
state.readableListening = true;
state.emittedReadable = false;
state.needReadable = true;
if (!state.reading) {
this.read(0);
} else if (state.length) {
emitReadable(this, state);
}
}
}
return res;
};
Readable.prototype.addListener = Readable.prototype.on;
// pause() and resume() are remnants of the legacy readable stream API
// If the user uses them, then switch into old mode.
Readable.prototype.resume = function() {
emitDataEvents(this);
this.read(0);
this.emit('resume');
};
Readable.prototype.pause = function() {
emitDataEvents(this, true);
this.emit('pause');
};
function emitDataEvents(stream, startPaused) {
var state = stream._readableState;
if (state.flowing) {
// https://github.com/isaacs/readable-stream/issues/16
throw new Error('Cannot switch to old mode now.');
}
var paused = startPaused || false;
var readable = false;
// convert to an old-style stream.
stream.readable = true;
stream.pipe = Stream.prototype.pipe;
stream.on = stream.addListener = Stream.prototype.on;
stream.on('readable', function() {
readable = true;
var c;
while (!paused && (null !== (c = stream.read())))
stream.emit('data', c);
if (c === null) {
readable = false;
stream._readableState.needReadable = true;
}
});
stream.pause = function() {
paused = true;
this.emit('pause');
};
stream.resume = function() {
paused = false;
if (readable)
process.nextTick(function() {
stream.emit('readable');
});
else
this.read(0);
this.emit('resume');
};
// now make it start, just in case it hadn't already.
stream.emit('readable');
}
// wrap an old-style stream as the async data source.
// This is *not* part of the readable stream interface.
// It is an ugly unfortunate mess of history.
Readable.prototype.wrap = function(stream) {
var state = this._readableState;
var paused = false;
var self = this;
stream.on('end', function() {
if (state.decoder && !state.ended) {
var chunk = state.decoder.end();
if (chunk && chunk.length)
self.push(chunk);
}
self.push(null);
});
stream.on('data', function(chunk) {
if (state.decoder)
chunk = state.decoder.write(chunk);
// don't skip over falsy values in objectMode
//if (state.objectMode && util.isNullOrUndefined(chunk))
if (state.objectMode && (chunk === null || chunk === undefined))
return;
else if (!state.objectMode && (!chunk || !chunk.length))
return;
var ret = self.push(chunk);
if (!ret) {
paused = true;
stream.pause();
}
});
// proxy all the other methods.
// important when wrapping filters and duplexes.
for (var i in stream) {
if (typeof stream[i] === 'function' &&
typeof this[i] === 'undefined') {
this[i] = function(method) { return function() {
return stream[method].apply(stream, arguments);
}}(i);
}
}
// proxy certain important events.
var events = ['error', 'close', 'destroy', 'pause', 'resume'];
forEach(events, function(ev) {
stream.on(ev, self.emit.bind(self, ev));
});
// when we try to consume some more bytes, simply unpause the
// underlying stream.
self._read = function(n) {
if (paused) {
paused = false;
stream.resume();
}
};
return self;
};
// exposed for testing purposes only.
Readable._fromList = fromList;
// Pluck off n bytes from an array of buffers.
// Length is the combined lengths of all the buffers in the list.
function fromList(n, state) {
var list = state.buffer;
var length = state.length;
var stringMode = !!state.decoder;
var objectMode = !!state.objectMode;
var ret;
// nothing in the list, definitely empty.
if (list.length === 0)
return null;
if (length === 0)
ret = null;
else if (objectMode)
ret = list.shift();
else if (!n || n >= length) {
// read it all, truncate the array.
if (stringMode)
ret = list.join('');
else
ret = Buffer.concat(list, length);
list.length = 0;
} else {
// read just some of it.
if (n < list[0].length) {
// just take a part of the first list item.
// slice is the same for buffers and strings.
var buf = list[0];
ret = buf.slice(0, n);
list[0] = buf.slice(n);
} else if (n === list[0].length) {
// first list is a perfect match
ret = list.shift();
} else {
// complex case.
// we have enough to cover it, but it spans past the first buffer.
if (stringMode)
ret = '';
else
ret = new Buffer(n);
var c = 0;
for (var i = 0, l = list.length; i < l && c < n; i++) {
var buf = list[0];
var cpy = Math.min(n - c, buf.length);
if (stringMode)
ret += buf.slice(0, cpy);
else
buf.copy(ret, c, 0, cpy);
if (cpy < buf.length)
list[0] = buf.slice(cpy);
else
list.shift();
c += cpy;
}
}
}
return ret;
}
function endReadable(stream) {
var state = stream._readableState;
// If we get here before consuming all the bytes, then that is a
// bug in node. Should never happen.
if (state.length > 0)
throw new Error('endReadable called on non-empty stream');
if (!state.endEmitted && state.calledRead) {
state.ended = true;
process.nextTick(function() {
// Check that we didn't get one last unshift.
if (!state.endEmitted && state.length === 0) {
state.endEmitted = true;
stream.readable = false;
stream.emit('end');
}
});
}
}
function forEach (xs, f) {
for (var i = 0, l = xs.length; i < l; i++) {
f(xs[i], i);
}
}
function indexOf (xs, x) {
for (var i = 0, l = xs.length; i < l; i++) {
if (xs[i] === x) return i;
}
return -1;
}

View File

@@ -0,0 +1,210 @@
// Copyright Joyent, Inc. and other Node contributors.
//
// Permission is hereby granted, free of charge, to any person obtaining a
// copy of this software and associated documentation files (the
// "Software"), to deal in the Software without restriction, including
// without limitation the rights to use, copy, modify, merge, publish,
// distribute, sublicense, and/or sell copies of the Software, and to permit
// persons to whom the Software is furnished to do so, subject to the
// following conditions:
//
// The above copyright notice and this permission notice shall be included
// in all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
// USE OR OTHER DEALINGS IN THE SOFTWARE.
// a transform stream is a readable/writable stream where you do
// something with the data. Sometimes it's called a "filter",
// but that's not a great name for it, since that implies a thing where
// some bits pass through, and others are simply ignored. (That would
// be a valid example of a transform, of course.)
//
// While the output is causally related to the input, it's not a
// necessarily symmetric or synchronous transformation. For example,
// a zlib stream might take multiple plain-text writes(), and then
// emit a single compressed chunk some time in the future.
//
// Here's how this works:
//
// The Transform stream has all the aspects of the readable and writable
// stream classes. When you write(chunk), that calls _write(chunk,cb)
// internally, and returns false if there's a lot of pending writes
// buffered up. When you call read(), that calls _read(n) until
// there's enough pending readable data buffered up.
//
// In a transform stream, the written data is placed in a buffer. When
// _read(n) is called, it transforms the queued up data, calling the
// buffered _write cb's as it consumes chunks. If consuming a single
// written chunk would result in multiple output chunks, then the first
// outputted bit calls the readcb, and subsequent chunks just go into
// the read buffer, and will cause it to emit 'readable' if necessary.
//
// This way, back-pressure is actually determined by the reading side,
// since _read has to be called to start processing a new chunk. However,
// a pathological inflate type of transform can cause excessive buffering
// here. For example, imagine a stream where every byte of input is
// interpreted as an integer from 0-255, and then results in that many
// bytes of output. Writing the 4 bytes {ff,ff,ff,ff} would result in
// 1kb of data being output. In this case, you could write a very small
// amount of input, and end up with a very large amount of output. In
// such a pathological inflating mechanism, there'd be no way to tell
// the system to stop doing the transform. A single 4MB write could
// cause the system to run out of memory.
//
// However, even in such a pathological case, only a single written chunk
// would be consumed, and then the rest would wait (un-transformed) until
// the results of the previous transformed chunk were consumed.
module.exports = Transform;
var Duplex = require('./_stream_duplex');
/*<replacement>*/
var util = require('core-util-is');
util.inherits = require('inherits');
/*</replacement>*/
util.inherits(Transform, Duplex);
function TransformState(options, stream) {
this.afterTransform = function(er, data) {
return afterTransform(stream, er, data);
};
this.needTransform = false;
this.transforming = false;
this.writecb = null;
this.writechunk = null;
}
function afterTransform(stream, er, data) {
var ts = stream._transformState;
ts.transforming = false;
var cb = ts.writecb;
if (!cb)
return stream.emit('error', new Error('no writecb in Transform class'));
ts.writechunk = null;
ts.writecb = null;
if (data !== null && data !== undefined)
stream.push(data);
if (cb)
cb(er);
var rs = stream._readableState;
rs.reading = false;
if (rs.needReadable || rs.length < rs.highWaterMark) {
stream._read(rs.highWaterMark);
}
}
function Transform(options) {
if (!(this instanceof Transform))
return new Transform(options);
Duplex.call(this, options);
var ts = this._transformState = new TransformState(options, this);
// when the writable side finishes, then flush out anything remaining.
var stream = this;
// start out asking for a readable event once data is transformed.
this._readableState.needReadable = true;
// we have implemented the _read method, and done the other things
// that Readable wants before the first _read call, so unset the
// sync guard flag.
this._readableState.sync = false;
this.once('finish', function() {
if ('function' === typeof this._flush)
this._flush(function(er) {
done(stream, er);
});
else
done(stream);
});
}
Transform.prototype.push = function(chunk, encoding) {
this._transformState.needTransform = false;
return Duplex.prototype.push.call(this, chunk, encoding);
};
// This is the part where you do stuff!
// override this function in implementation classes.
// 'chunk' is an input chunk.
//
// Call `push(newChunk)` to pass along transformed output
// to the readable side. You may call 'push' zero or more times.
//
// Call `cb(err)` when you are done with this chunk. If you pass
// an error, then that'll put the hurt on the whole operation. If you
// never call cb(), then you'll never get another chunk.
Transform.prototype._transform = function(chunk, encoding, cb) {
throw new Error('not implemented');
};
Transform.prototype._write = function(chunk, encoding, cb) {
var ts = this._transformState;
ts.writecb = cb;
ts.writechunk = chunk;
ts.writeencoding = encoding;
if (!ts.transforming) {
var rs = this._readableState;
if (ts.needTransform ||
rs.needReadable ||
rs.length < rs.highWaterMark)
this._read(rs.highWaterMark);
}
};
// Doesn't matter what the args are here.
// _transform does all the work.
// That we got here means that the readable side wants more data.
Transform.prototype._read = function(n) {
var ts = this._transformState;
if (ts.writechunk !== null && ts.writecb && !ts.transforming) {
ts.transforming = true;
this._transform(ts.writechunk, ts.writeencoding, ts.afterTransform);
} else {
// mark that we need a transform, so that any data that comes in
// will get processed, now that we've asked for it.
ts.needTransform = true;
}
};
function done(stream, er) {
if (er)
return stream.emit('error', er);
// if there's nothing in the write buffer, then that means
// that nothing more will ever be provided
var ws = stream._writableState;
var rs = stream._readableState;
var ts = stream._transformState;
if (ws.length)
throw new Error('calling transform done when ws.length != 0');
if (ts.transforming)
throw new Error('calling transform done when still transforming');
return stream.push(null);
}

View File

@@ -0,0 +1,386 @@
// Copyright Joyent, Inc. and other Node contributors.
//
// Permission is hereby granted, free of charge, to any person obtaining a
// copy of this software and associated documentation files (the
// "Software"), to deal in the Software without restriction, including
// without limitation the rights to use, copy, modify, merge, publish,
// distribute, sublicense, and/or sell copies of the Software, and to permit
// persons to whom the Software is furnished to do so, subject to the
// following conditions:
//
// The above copyright notice and this permission notice shall be included
// in all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
// USE OR OTHER DEALINGS IN THE SOFTWARE.
// A bit simpler than readable streams.
// Implement an async ._write(chunk, cb), and it'll handle all
// the drain event emission and buffering.
module.exports = Writable;
/*<replacement>*/
var Buffer = require('buffer').Buffer;
/*</replacement>*/
Writable.WritableState = WritableState;
/*<replacement>*/
var util = require('core-util-is');
util.inherits = require('inherits');
/*</replacement>*/
var Stream = require('stream');
util.inherits(Writable, Stream);
function WriteReq(chunk, encoding, cb) {
this.chunk = chunk;
this.encoding = encoding;
this.callback = cb;
}
function WritableState(options, stream) {
options = options || {};
// the point at which write() starts returning false
// Note: 0 is a valid value, means that we always return false if
// the entire buffer is not flushed immediately on write()
var hwm = options.highWaterMark;
this.highWaterMark = (hwm || hwm === 0) ? hwm : 16 * 1024;
// object stream flag to indicate whether or not this stream
// contains buffers or objects.
this.objectMode = !!options.objectMode;
// cast to ints.
this.highWaterMark = ~~this.highWaterMark;
this.needDrain = false;
// at the start of calling end()
this.ending = false;
// when end() has been called, and returned
this.ended = false;
// when 'finish' is emitted
this.finished = false;
// should we decode strings into buffers before passing to _write?
// this is here so that some node-core streams can optimize string
// handling at a lower level.
var noDecode = options.decodeStrings === false;
this.decodeStrings = !noDecode;
// Crypto is kind of old and crusty. Historically, its default string
// encoding is 'binary' so we have to make this configurable.
// Everything else in the universe uses 'utf8', though.
this.defaultEncoding = options.defaultEncoding || 'utf8';
// not an actual buffer we keep track of, but a measurement
// of how much we're waiting to get pushed to some underlying
// socket or file.
this.length = 0;
// a flag to see when we're in the middle of a write.
this.writing = false;
// a flag to be able to tell if the onwrite cb is called immediately,
// or on a later tick. We set this to true at first, because any
// actions that shouldn't happen until "later" should generally also
// not happen before the first write call.
this.sync = true;
// a flag to know if we're processing previously buffered items, which
// may call the _write() callback in the same tick, so that we don't
// end up in an overlapped onwrite situation.
this.bufferProcessing = false;
// the callback that's passed to _write(chunk,cb)
this.onwrite = function(er) {
onwrite(stream, er);
};
// the callback that the user supplies to write(chunk,encoding,cb)
this.writecb = null;
// the amount that is being written when _write is called.
this.writelen = 0;
this.buffer = [];
// True if the error was already emitted and should not be thrown again
this.errorEmitted = false;
}
function Writable(options) {
var Duplex = require('./_stream_duplex');
// Writable ctor is applied to Duplexes, though they're not
// instanceof Writable, they're instanceof Readable.
if (!(this instanceof Writable) && !(this instanceof Duplex))
return new Writable(options);
this._writableState = new WritableState(options, this);
// legacy.
this.writable = true;
Stream.call(this);
}
// Otherwise people can pipe Writable streams, which is just wrong.
Writable.prototype.pipe = function() {
this.emit('error', new Error('Cannot pipe. Not readable.'));
};
function writeAfterEnd(stream, state, cb) {
var er = new Error('write after end');
// TODO: defer error events consistently everywhere, not just the cb
stream.emit('error', er);
process.nextTick(function() {
cb(er);
});
}
// If we get something that is not a buffer, string, null, or undefined,
// and we're not in objectMode, then that's an error.
// Otherwise stream chunks are all considered to be of length=1, and the
// watermarks determine how many objects to keep in the buffer, rather than
// how many bytes or characters.
function validChunk(stream, state, chunk, cb) {
var valid = true;
if (!Buffer.isBuffer(chunk) &&
'string' !== typeof chunk &&
chunk !== null &&
chunk !== undefined &&
!state.objectMode) {
var er = new TypeError('Invalid non-string/buffer chunk');
stream.emit('error', er);
process.nextTick(function() {
cb(er);
});
valid = false;
}
return valid;
}
Writable.prototype.write = function(chunk, encoding, cb) {
var state = this._writableState;
var ret = false;
if (typeof encoding === 'function') {
cb = encoding;
encoding = null;
}
if (Buffer.isBuffer(chunk))
encoding = 'buffer';
else if (!encoding)
encoding = state.defaultEncoding;
if (typeof cb !== 'function')
cb = function() {};
if (state.ended)
writeAfterEnd(this, state, cb);
else if (validChunk(this, state, chunk, cb))
ret = writeOrBuffer(this, state, chunk, encoding, cb);
return ret;
};
function decodeChunk(state, chunk, encoding) {
if (!state.objectMode &&
state.decodeStrings !== false &&
typeof chunk === 'string') {
chunk = new Buffer(chunk, encoding);
}
return chunk;
}
// if we're already writing something, then just put this
// in the queue, and wait our turn. Otherwise, call _write
// If we return false, then we need a drain event, so set that flag.
function writeOrBuffer(stream, state, chunk, encoding, cb) {
chunk = decodeChunk(state, chunk, encoding);
if (Buffer.isBuffer(chunk))
encoding = 'buffer';
var len = state.objectMode ? 1 : chunk.length;
state.length += len;
var ret = state.length < state.highWaterMark;
// we must ensure that previous needDrain will not be reset to false.
if (!ret)
state.needDrain = true;
if (state.writing)
state.buffer.push(new WriteReq(chunk, encoding, cb));
else
doWrite(stream, state, len, chunk, encoding, cb);
return ret;
}
function doWrite(stream, state, len, chunk, encoding, cb) {
state.writelen = len;
state.writecb = cb;
state.writing = true;
state.sync = true;
stream._write(chunk, encoding, state.onwrite);
state.sync = false;
}
function onwriteError(stream, state, sync, er, cb) {
if (sync)
process.nextTick(function() {
cb(er);
});
else
cb(er);
stream._writableState.errorEmitted = true;
stream.emit('error', er);
}
function onwriteStateUpdate(state) {
state.writing = false;
state.writecb = null;
state.length -= state.writelen;
state.writelen = 0;
}
function onwrite(stream, er) {
var state = stream._writableState;
var sync = state.sync;
var cb = state.writecb;
onwriteStateUpdate(state);
if (er)
onwriteError(stream, state, sync, er, cb);
else {
// Check if we're actually ready to finish, but don't emit yet
var finished = needFinish(stream, state);
if (!finished && !state.bufferProcessing && state.buffer.length)
clearBuffer(stream, state);
if (sync) {
process.nextTick(function() {
afterWrite(stream, state, finished, cb);
});
} else {
afterWrite(stream, state, finished, cb);
}
}
}
function afterWrite(stream, state, finished, cb) {
if (!finished)
onwriteDrain(stream, state);
cb();
if (finished)
finishMaybe(stream, state);
}
// Must force callback to be called on nextTick, so that we don't
// emit 'drain' before the write() consumer gets the 'false' return
// value, and has a chance to attach a 'drain' listener.
function onwriteDrain(stream, state) {
if (state.length === 0 && state.needDrain) {
state.needDrain = false;
stream.emit('drain');
}
}
// if there's something in the buffer waiting, then process it
function clearBuffer(stream, state) {
state.bufferProcessing = true;
for (var c = 0; c < state.buffer.length; c++) {
var entry = state.buffer[c];
var chunk = entry.chunk;
var encoding = entry.encoding;
var cb = entry.callback;
var len = state.objectMode ? 1 : chunk.length;
doWrite(stream, state, len, chunk, encoding, cb);
// if we didn't call the onwrite immediately, then
// it means that we need to wait until it does.
// also, that means that the chunk and cb are currently
// being processed, so move the buffer counter past them.
if (state.writing) {
c++;
break;
}
}
state.bufferProcessing = false;
if (c < state.buffer.length)
state.buffer = state.buffer.slice(c);
else
state.buffer.length = 0;
}
Writable.prototype._write = function(chunk, encoding, cb) {
cb(new Error('not implemented'));
};
Writable.prototype.end = function(chunk, encoding, cb) {
var state = this._writableState;
if (typeof chunk === 'function') {
cb = chunk;
chunk = null;
encoding = null;
} else if (typeof encoding === 'function') {
cb = encoding;
encoding = null;
}
if (typeof chunk !== 'undefined' && chunk !== null)
this.write(chunk, encoding);
// ignore unnecessary end() calls.
if (!state.ending && !state.finished)
endWritable(this, state, cb);
};
function needFinish(stream, state) {
return (state.ending &&
state.length === 0 &&
!state.finished &&
!state.writing);
}
function finishMaybe(stream, state) {
var need = needFinish(stream, state);
if (need) {
state.finished = true;
stream.emit('finish');
}
return need;
}
function endWritable(stream, state, cb) {
state.ending = true;
finishMaybe(stream, state);
if (cb) {
if (state.finished)
process.nextTick(cb);
else
stream.once('finish', cb);
}
state.ended = true;
}

View File

@@ -0,0 +1,3 @@
# core-util-is
The `util.is*` functions introduced in Node v0.12.
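A brief usage sketch (assuming the package is installed; `isBuffer`, `isString` and `isNullOrUndefined` are a few of the exported checks):
```js
var util = require('core-util-is')
console.log(util.isBuffer(new Buffer('abc'))) // true
console.log(util.isString('abc')) // true
console.log(util.isNullOrUndefined(null)) // true
```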

View File

@@ -0,0 +1,604 @@
diff --git a/lib/util.js b/lib/util.js
index a03e874..9074e8e 100644
--- a/lib/util.js
+++ b/lib/util.js
@@ -19,430 +19,6 @@
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
// USE OR OTHER DEALINGS IN THE SOFTWARE.
-var formatRegExp = /%[sdj%]/g;
-exports.format = function(f) {
- if (!isString(f)) {
- var objects = [];
- for (var i = 0; i < arguments.length; i++) {
- objects.push(inspect(arguments[i]));
- }
- return objects.join(' ');
- }
-
- var i = 1;
- var args = arguments;
- var len = args.length;
- var str = String(f).replace(formatRegExp, function(x) {
- if (x === '%%') return '%';
- if (i >= len) return x;
- switch (x) {
- case '%s': return String(args[i++]);
- case '%d': return Number(args[i++]);
- case '%j':
- try {
- return JSON.stringify(args[i++]);
- } catch (_) {
- return '[Circular]';
- }
- default:
- return x;
- }
- });
- for (var x = args[i]; i < len; x = args[++i]) {
- if (isNull(x) || !isObject(x)) {
- str += ' ' + x;
- } else {
- str += ' ' + inspect(x);
- }
- }
- return str;
-};
-
-
-// Mark that a method should not be used.
-// Returns a modified function which warns once by default.
-// If --no-deprecation is set, then it is a no-op.
-exports.deprecate = function(fn, msg) {
- // Allow for deprecating things in the process of starting up.
- if (isUndefined(global.process)) {
- return function() {
- return exports.deprecate(fn, msg).apply(this, arguments);
- };
- }
-
- if (process.noDeprecation === true) {
- return fn;
- }
-
- var warned = false;
- function deprecated() {
- if (!warned) {
- if (process.throwDeprecation) {
- throw new Error(msg);
- } else if (process.traceDeprecation) {
- console.trace(msg);
- } else {
- console.error(msg);
- }
- warned = true;
- }
- return fn.apply(this, arguments);
- }
-
- return deprecated;
-};
-
-
-var debugs = {};
-var debugEnviron;
-exports.debuglog = function(set) {
- if (isUndefined(debugEnviron))
- debugEnviron = process.env.NODE_DEBUG || '';
- set = set.toUpperCase();
- if (!debugs[set]) {
- if (new RegExp('\\b' + set + '\\b', 'i').test(debugEnviron)) {
- var pid = process.pid;
- debugs[set] = function() {
- var msg = exports.format.apply(exports, arguments);
- console.error('%s %d: %s', set, pid, msg);
- };
- } else {
- debugs[set] = function() {};
- }
- }
- return debugs[set];
-};
-
-
-/**
- * Echoes the value. Tries to print the value out
- * in the best way possible given the different types.
- *
- * @param {Object} obj The object to print out.
- * @param {Object} opts Optional options object that alters the output.
- */
-/* legacy: obj, showHidden, depth, colors*/
-function inspect(obj, opts) {
- // default options
- var ctx = {
- seen: [],
- stylize: stylizeNoColor
- };
- // legacy...
- if (arguments.length >= 3) ctx.depth = arguments[2];
- if (arguments.length >= 4) ctx.colors = arguments[3];
- if (isBoolean(opts)) {
- // legacy...
- ctx.showHidden = opts;
- } else if (opts) {
- // got an "options" object
- exports._extend(ctx, opts);
- }
- // set default options
- if (isUndefined(ctx.showHidden)) ctx.showHidden = false;
- if (isUndefined(ctx.depth)) ctx.depth = 2;
- if (isUndefined(ctx.colors)) ctx.colors = false;
- if (isUndefined(ctx.customInspect)) ctx.customInspect = true;
- if (ctx.colors) ctx.stylize = stylizeWithColor;
- return formatValue(ctx, obj, ctx.depth);
-}
-exports.inspect = inspect;
-
-
-// http://en.wikipedia.org/wiki/ANSI_escape_code#graphics
-inspect.colors = {
- 'bold' : [1, 22],
- 'italic' : [3, 23],
- 'underline' : [4, 24],
- 'inverse' : [7, 27],
- 'white' : [37, 39],
- 'grey' : [90, 39],
- 'black' : [30, 39],
- 'blue' : [34, 39],
- 'cyan' : [36, 39],
- 'green' : [32, 39],
- 'magenta' : [35, 39],
- 'red' : [31, 39],
- 'yellow' : [33, 39]
-};
-
-// Don't use 'blue' not visible on cmd.exe
-inspect.styles = {
- 'special': 'cyan',
- 'number': 'yellow',
- 'boolean': 'yellow',
- 'undefined': 'grey',
- 'null': 'bold',
- 'string': 'green',
- 'date': 'magenta',
- // "name": intentionally not styling
- 'regexp': 'red'
-};
-
-
-function stylizeWithColor(str, styleType) {
- var style = inspect.styles[styleType];
-
- if (style) {
- return '\u001b[' + inspect.colors[style][0] + 'm' + str +
- '\u001b[' + inspect.colors[style][1] + 'm';
- } else {
- return str;
- }
-}
-
-
-function stylizeNoColor(str, styleType) {
- return str;
-}
-
-
-function arrayToHash(array) {
- var hash = {};
-
- array.forEach(function(val, idx) {
- hash[val] = true;
- });
-
- return hash;
-}
-
-
-function formatValue(ctx, value, recurseTimes) {
- // Provide a hook for user-specified inspect functions.
- // Check that value is an object with an inspect function on it
- if (ctx.customInspect &&
- value &&
- isFunction(value.inspect) &&
- // Filter out the util module; its inspect function is special
- value.inspect !== exports.inspect &&
- // Also filter out any prototype objects using the circular check.
- !(value.constructor && value.constructor.prototype === value)) {
- var ret = value.inspect(recurseTimes, ctx);
- if (!isString(ret)) {
- ret = formatValue(ctx, ret, recurseTimes);
- }
- return ret;
- }
-
- // Primitive types cannot have properties
- var primitive = formatPrimitive(ctx, value);
- if (primitive) {
- return primitive;
- }
-
- // Look up the keys of the object.
- var keys = Object.keys(value);
- var visibleKeys = arrayToHash(keys);
-
- if (ctx.showHidden) {
- keys = Object.getOwnPropertyNames(value);
- }
-
- // Some type of object without properties can be shortcutted.
- if (keys.length === 0) {
- if (isFunction(value)) {
- var name = value.name ? ': ' + value.name : '';
- return ctx.stylize('[Function' + name + ']', 'special');
- }
- if (isRegExp(value)) {
- return ctx.stylize(RegExp.prototype.toString.call(value), 'regexp');
- }
- if (isDate(value)) {
- return ctx.stylize(Date.prototype.toString.call(value), 'date');
- }
- if (isError(value)) {
- return formatError(value);
- }
- }
-
- var base = '', array = false, braces = ['{', '}'];
-
- // Make Array say that they are Array
- if (isArray(value)) {
- array = true;
- braces = ['[', ']'];
- }
-
- // Make functions say that they are functions
- if (isFunction(value)) {
- var n = value.name ? ': ' + value.name : '';
- base = ' [Function' + n + ']';
- }
-
- // Make RegExps say that they are RegExps
- if (isRegExp(value)) {
- base = ' ' + RegExp.prototype.toString.call(value);
- }
-
- // Make dates with properties first say the date
- if (isDate(value)) {
- base = ' ' + Date.prototype.toUTCString.call(value);
- }
-
- // Make error with message first say the error
- if (isError(value)) {
- base = ' ' + formatError(value);
- }
-
- if (keys.length === 0 && (!array || value.length == 0)) {
- return braces[0] + base + braces[1];
- }
-
- if (recurseTimes < 0) {
- if (isRegExp(value)) {
- return ctx.stylize(RegExp.prototype.toString.call(value), 'regexp');
- } else {
- return ctx.stylize('[Object]', 'special');
- }
- }
-
- ctx.seen.push(value);
-
- var output;
- if (array) {
- output = formatArray(ctx, value, recurseTimes, visibleKeys, keys);
- } else {
- output = keys.map(function(key) {
- return formatProperty(ctx, value, recurseTimes, visibleKeys, key, array);
- });
- }
-
- ctx.seen.pop();
-
- return reduceToSingleString(output, base, braces);
-}
-
-
-function formatPrimitive(ctx, value) {
- if (isUndefined(value))
- return ctx.stylize('undefined', 'undefined');
- if (isString(value)) {
- var simple = '\'' + JSON.stringify(value).replace(/^"|"$/g, '')
- .replace(/'/g, "\\'")
- .replace(/\\"/g, '"') + '\'';
- return ctx.stylize(simple, 'string');
- }
- if (isNumber(value)) {
- // Format -0 as '-0'. Strict equality won't distinguish 0 from -0,
- // so instead we use the fact that 1 / -0 < 0 whereas 1 / 0 > 0 .
- if (value === 0 && 1 / value < 0)
- return ctx.stylize('-0', 'number');
- return ctx.stylize('' + value, 'number');
- }
- if (isBoolean(value))
- return ctx.stylize('' + value, 'boolean');
- // For some reason typeof null is "object", so special case here.
- if (isNull(value))
- return ctx.stylize('null', 'null');
-}
-
-
-function formatError(value) {
- return '[' + Error.prototype.toString.call(value) + ']';
-}
-
-
-function formatArray(ctx, value, recurseTimes, visibleKeys, keys) {
- var output = [];
- for (var i = 0, l = value.length; i < l; ++i) {
- if (hasOwnProperty(value, String(i))) {
- output.push(formatProperty(ctx, value, recurseTimes, visibleKeys,
- String(i), true));
- } else {
- output.push('');
- }
- }
- keys.forEach(function(key) {
- if (!key.match(/^\d+$/)) {
- output.push(formatProperty(ctx, value, recurseTimes, visibleKeys,
- key, true));
- }
- });
- return output;
-}
-
-
-function formatProperty(ctx, value, recurseTimes, visibleKeys, key, array) {
- var name, str, desc;
- desc = Object.getOwnPropertyDescriptor(value, key) || { value: value[key] };
- if (desc.get) {
- if (desc.set) {
- str = ctx.stylize('[Getter/Setter]', 'special');
- } else {
- str = ctx.stylize('[Getter]', 'special');
- }
- } else {
- if (desc.set) {
- str = ctx.stylize('[Setter]', 'special');
- }
- }
- if (!hasOwnProperty(visibleKeys, key)) {
- name = '[' + key + ']';
- }
- if (!str) {
- if (ctx.seen.indexOf(desc.value) < 0) {
- if (isNull(recurseTimes)) {
- str = formatValue(ctx, desc.value, null);
- } else {
- str = formatValue(ctx, desc.value, recurseTimes - 1);
- }
- if (str.indexOf('\n') > -1) {
- if (array) {
- str = str.split('\n').map(function(line) {
- return ' ' + line;
- }).join('\n').substr(2);
- } else {
- str = '\n' + str.split('\n').map(function(line) {
- return ' ' + line;
- }).join('\n');
- }
- }
- } else {
- str = ctx.stylize('[Circular]', 'special');
- }
- }
- if (isUndefined(name)) {
- if (array && key.match(/^\d+$/)) {
- return str;
- }
- name = JSON.stringify('' + key);
- if (name.match(/^"([a-zA-Z_][a-zA-Z_0-9]*)"$/)) {
- name = name.substr(1, name.length - 2);
- name = ctx.stylize(name, 'name');
- } else {
- name = name.replace(/'/g, "\\'")
- .replace(/\\"/g, '"')
- .replace(/(^"|"$)/g, "'");
- name = ctx.stylize(name, 'string');
- }
- }
-
- return name + ': ' + str;
-}
-
-
-function reduceToSingleString(output, base, braces) {
- var numLinesEst = 0;
- var length = output.reduce(function(prev, cur) {
- numLinesEst++;
- if (cur.indexOf('\n') >= 0) numLinesEst++;
- return prev + cur.replace(/\u001b\[\d\d?m/g, '').length + 1;
- }, 0);
-
- if (length > 60) {
- return braces[0] +
- (base === '' ? '' : base + '\n ') +
- ' ' +
- output.join(',\n ') +
- ' ' +
- braces[1];
- }
-
- return braces[0] + base + ' ' + output.join(', ') + ' ' + braces[1];
-}
-
-
// NOTE: These type checking functions intentionally don't use `instanceof`
// because it is fragile and can be easily faked with `Object.create()`.
function isArray(ar) {
@@ -522,166 +98,10 @@ function isPrimitive(arg) {
exports.isPrimitive = isPrimitive;
function isBuffer(arg) {
- return arg instanceof Buffer;
+ return Buffer.isBuffer(arg);
}
exports.isBuffer = isBuffer;
function objectToString(o) {
return Object.prototype.toString.call(o);
-}
-
-
-function pad(n) {
- return n < 10 ? '0' + n.toString(10) : n.toString(10);
-}
-
-
-var months = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun', 'Jul', 'Aug', 'Sep',
- 'Oct', 'Nov', 'Dec'];
-
-// 26 Feb 16:19:34
-function timestamp() {
- var d = new Date();
- var time = [pad(d.getHours()),
- pad(d.getMinutes()),
- pad(d.getSeconds())].join(':');
- return [d.getDate(), months[d.getMonth()], time].join(' ');
-}
-
-
-// log is just a thin wrapper to console.log that prepends a timestamp
-exports.log = function() {
- console.log('%s - %s', timestamp(), exports.format.apply(exports, arguments));
-};
-
-
-/**
- * Inherit the prototype methods from one constructor into another.
- *
- * The Function.prototype.inherits from lang.js rewritten as a standalone
- * function (not on Function.prototype). NOTE: If this file is to be loaded
- * during bootstrapping this function needs to be rewritten using some native
- * functions as prototype setup using normal JavaScript does not work as
- * expected during bootstrapping (see mirror.js in r114903).
- *
- * @param {function} ctor Constructor function which needs to inherit the
- * prototype.
- * @param {function} superCtor Constructor function to inherit prototype from.
- */
-exports.inherits = function(ctor, superCtor) {
- ctor.super_ = superCtor;
- ctor.prototype = Object.create(superCtor.prototype, {
- constructor: {
- value: ctor,
- enumerable: false,
- writable: true,
- configurable: true
- }
- });
-};
-
-exports._extend = function(origin, add) {
- // Don't do anything if add isn't an object
- if (!add || !isObject(add)) return origin;
-
- var keys = Object.keys(add);
- var i = keys.length;
- while (i--) {
- origin[keys[i]] = add[keys[i]];
- }
- return origin;
-};
-
-function hasOwnProperty(obj, prop) {
- return Object.prototype.hasOwnProperty.call(obj, prop);
-}
-
-
-// Deprecated old stuff.
-
-exports.p = exports.deprecate(function() {
- for (var i = 0, len = arguments.length; i < len; ++i) {
- console.error(exports.inspect(arguments[i]));
- }
-}, 'util.p: Use console.error() instead');
-
-
-exports.exec = exports.deprecate(function() {
- return require('child_process').exec.apply(this, arguments);
-}, 'util.exec is now called `child_process.exec`.');
-
-
-exports.print = exports.deprecate(function() {
- for (var i = 0, len = arguments.length; i < len; ++i) {
- process.stdout.write(String(arguments[i]));
- }
-}, 'util.print: Use console.log instead');
-
-
-exports.puts = exports.deprecate(function() {
- for (var i = 0, len = arguments.length; i < len; ++i) {
- process.stdout.write(arguments[i] + '\n');
- }
-}, 'util.puts: Use console.log instead');
-
-
-exports.debug = exports.deprecate(function(x) {
- process.stderr.write('DEBUG: ' + x + '\n');
-}, 'util.debug: Use console.error instead');
-
-
-exports.error = exports.deprecate(function(x) {
- for (var i = 0, len = arguments.length; i < len; ++i) {
- process.stderr.write(arguments[i] + '\n');
- }
-}, 'util.error: Use console.error instead');
-
-
-exports.pump = exports.deprecate(function(readStream, writeStream, callback) {
- var callbackCalled = false;
-
- function call(a, b, c) {
- if (callback && !callbackCalled) {
- callback(a, b, c);
- callbackCalled = true;
- }
- }
-
- readStream.addListener('data', function(chunk) {
- if (writeStream.write(chunk) === false) readStream.pause();
- });
-
- writeStream.addListener('drain', function() {
- readStream.resume();
- });
-
- readStream.addListener('end', function() {
- writeStream.end();
- });
-
- readStream.addListener('close', function() {
- call();
- });
-
- readStream.addListener('error', function(err) {
- writeStream.end();
- call(err);
- });
-
- writeStream.addListener('error', function(err) {
- readStream.destroy();
- call(err);
- });
-}, 'util.pump(): Use readableStream.pipe() instead');
-
-
-var uv;
-exports._errnoException = function(err, syscall) {
- if (isUndefined(uv)) uv = process.binding('uv');
- var errname = uv.errname(err);
- var e = new Error(syscall + ' ' + errname);
- e.code = errname;
- e.errno = errname;
- e.syscall = syscall;
- return e;
-};
+}
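The only functional change in this patch is isBuffer delegating to Buffer.isBuffer() instead of a local instanceof check, presumably so the answer follows whatever Buffer implementation the runtime or a browser bundle supplies. A quick illustration of the patched behaviour (values are illustrative):

var util = require('core-util-is');

util.isBuffer(new Buffer(4));     // true
util.isBuffer(new Uint8Array(4)); // false, binary data but not a Buffer
util.isBuffer('not a buffer');    // false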

View File

@@ -0,0 +1,107 @@
// Copyright Joyent, Inc. and other Node contributors.
//
// Permission is hereby granted, free of charge, to any person obtaining a
// copy of this software and associated documentation files (the
// "Software"), to deal in the Software without restriction, including
// without limitation the rights to use, copy, modify, merge, publish,
// distribute, sublicense, and/or sell copies of the Software, and to permit
// persons to whom the Software is furnished to do so, subject to the
// following conditions:
//
// The above copyright notice and this permission notice shall be included
// in all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
// USE OR OTHER DEALINGS IN THE SOFTWARE.
// NOTE: These type checking functions intentionally don't use `instanceof`
// because it is fragile and can be easily faked with `Object.create()`.
function isArray(ar) {
return Array.isArray(ar);
}
exports.isArray = isArray;
function isBoolean(arg) {
return typeof arg === 'boolean';
}
exports.isBoolean = isBoolean;
function isNull(arg) {
return arg === null;
}
exports.isNull = isNull;
function isNullOrUndefined(arg) {
return arg == null;
}
exports.isNullOrUndefined = isNullOrUndefined;
function isNumber(arg) {
return typeof arg === 'number';
}
exports.isNumber = isNumber;
function isString(arg) {
return typeof arg === 'string';
}
exports.isString = isString;
function isSymbol(arg) {
return typeof arg === 'symbol';
}
exports.isSymbol = isSymbol;
function isUndefined(arg) {
return arg === void 0;
}
exports.isUndefined = isUndefined;
function isRegExp(re) {
return isObject(re) && objectToString(re) === '[object RegExp]';
}
exports.isRegExp = isRegExp;
function isObject(arg) {
return typeof arg === 'object' && arg !== null;
}
exports.isObject = isObject;
function isDate(d) {
return isObject(d) && objectToString(d) === '[object Date]';
}
exports.isDate = isDate;
function isError(e) {
return isObject(e) &&
(objectToString(e) === '[object Error]' || e instanceof Error);
}
exports.isError = isError;
function isFunction(arg) {
return typeof arg === 'function';
}
exports.isFunction = isFunction;
function isPrimitive(arg) {
return arg === null ||
typeof arg === 'boolean' ||
typeof arg === 'number' ||
typeof arg === 'string' ||
typeof arg === 'symbol' || // ES6 symbol
typeof arg === 'undefined';
}
exports.isPrimitive = isPrimitive;
function isBuffer(arg) {
return Buffer.isBuffer(arg);
}
exports.isBuffer = isBuffer;
function objectToString(o) {
return Object.prototype.toString.call(o);
}
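The NOTE at the top of this file is easy to verify: an object that merely borrows a prototype passes instanceof but fails the Object.prototype.toString check used here (the fake variable is illustrative):

var util = require('core-util-is');

var fake = Object.create(Date.prototype); // borrows the prototype, but is not a real Date
console.log(fake instanceof Date);    // true, instanceof is fooled
console.log(util.isDate(fake));       // false, [[Class]] is still '[object Object]'
console.log(util.isDate(new Date())); // true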

View File

@@ -0,0 +1,37 @@
{
"name": "core-util-is",
"version": "1.0.1",
"description": "The `util.is*` functions introduced in Node v0.12.",
"main": "lib/util.js",
"repository": {
"type": "git",
"url": "git://github.com/isaacs/core-util-is.git"
},
"keywords": [
"util",
"isBuffer",
"isArray",
"isNumber",
"isString",
"isRegExp",
"isThis",
"isThat",
"polyfill"
],
"author": {
"name": "Isaac Z. Schlueter",
"email": "i@izs.me",
"url": "http://blog.izs.me/"
},
"license": "MIT",
"bugs": {
"url": "https://github.com/isaacs/core-util-is/issues"
},
"readme": "# core-util-is\n\nThe `util.is*` functions introduced in Node v0.12.\n",
"readmeFilename": "README.md",
"homepage": "https://github.com/isaacs/core-util-is#readme",
"_id": "core-util-is@1.0.1",
"_shasum": "6b07085aef9a3ccac6ee53bf9d3df0c1521a5538",
"_resolved": "https://registry.npmjs.org/core-util-is/-/core-util-is-1.0.1.tgz",
"_from": "core-util-is@>=1.0.0 <1.1.0"
}

View File

@@ -0,0 +1,106 @@
// Copyright Joyent, Inc. and other Node contributors.
//
// Permission is hereby granted, free of charge, to any person obtaining a
// copy of this software and associated documentation files (the
// "Software"), to deal in the Software without restriction, including
// without limitation the rights to use, copy, modify, merge, publish,
// distribute, sublicense, and/or sell copies of the Software, and to permit
// persons to whom the Software is furnished to do so, subject to the
// following conditions:
//
// The above copyright notice and this permission notice shall be included
// in all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
// USE OR OTHER DEALINGS IN THE SOFTWARE.
// NOTE: These type checking functions intentionally don't use `instanceof`
// because it is fragile and can be easily faked with `Object.create()`.
function isArray(ar) {
return Array.isArray(ar);
}
exports.isArray = isArray;
function isBoolean(arg) {
return typeof arg === 'boolean';
}
exports.isBoolean = isBoolean;
function isNull(arg) {
return arg === null;
}
exports.isNull = isNull;
function isNullOrUndefined(arg) {
return arg == null;
}
exports.isNullOrUndefined = isNullOrUndefined;
function isNumber(arg) {
return typeof arg === 'number';
}
exports.isNumber = isNumber;
function isString(arg) {
return typeof arg === 'string';
}
exports.isString = isString;
function isSymbol(arg) {
return typeof arg === 'symbol';
}
exports.isSymbol = isSymbol;
function isUndefined(arg) {
return arg === void 0;
}
exports.isUndefined = isUndefined;
function isRegExp(re) {
return isObject(re) && objectToString(re) === '[object RegExp]';
}
exports.isRegExp = isRegExp;
function isObject(arg) {
return typeof arg === 'object' && arg !== null;
}
exports.isObject = isObject;
function isDate(d) {
return isObject(d) && objectToString(d) === '[object Date]';
}
exports.isDate = isDate;
function isError(e) {
return isObject(e) && objectToString(e) === '[object Error]';
}
exports.isError = isError;
function isFunction(arg) {
return typeof arg === 'function';
}
exports.isFunction = isFunction;
function isPrimitive(arg) {
return arg === null ||
typeof arg === 'boolean' ||
typeof arg === 'number' ||
typeof arg === 'string' ||
typeof arg === 'symbol' || // ES6 symbol
typeof arg === 'undefined';
}
exports.isPrimitive = isPrimitive;
function isBuffer(arg) {
return arg instanceof Buffer;
}
exports.isBuffer = isBuffer;
function objectToString(o) {
return Object.prototype.toString.call(o);
}

Some files were not shown because too many files have changed in this diff.