author    Rich Trott <rtrott@gmail.com>  2018-11-05 20:40:07 -0800
committer Myles Borins <mylesborins@google.com>  2018-12-03 13:32:56 -0500
commit    1609ddaa74d461240ab928fe1dfa8449934d1e5e (patch)
tree      b75385e134ff663bc4d15d1048c3c5bce58adc94
parent    dddb466f592e2ccfe15fe2d2cf1280ce4e097dd9 (diff)
download  node-new-1609ddaa74d461240ab928fe1dfa8449934d1e5e.tar.gz
doc: fix minor text issues in stream.md
Implement several minor grammar, punctuation, and style fixes in stream.md.

PR-URL: https://github.com/nodejs/node/pull/24116
Reviewed-By: Richard Lau <riclau@uk.ibm.com>
Reviewed-By: Daniel Bevenius <daniel.bevenius@gmail.com>
-rw-r--r--  doc/api/stream.md  20
1 file changed, 10 insertions(+), 10 deletions(-)
diff --git a/doc/api/stream.md b/doc/api/stream.md
index bcec902afb..fbb8ad78fa 100644
--- a/doc/api/stream.md
+++ b/doc/api/stream.md
@@ -46,7 +46,7 @@ There are four fundamental stream types within Node.js:
* [`Transform`][] - `Duplex` streams that can modify or transform the data as it
is written and read (for example, [`zlib.createDeflate()`][]).
-Additionally this module includes the utility functions [pipeline][] and
+Additionally, this module includes the utility functions [pipeline][] and
[finished][].
### Object Mode
@@ -97,7 +97,7 @@ is to limit the buffering of data to acceptable levels such that sources and
destinations of differing speeds will not overwhelm the available memory.
Because [`Duplex`][] and [`Transform`][] streams are both `Readable` and
-`Writable`, each maintain *two* separate internal buffers used for reading and
+`Writable`, each maintains *two* separate internal buffers used for reading and
writing, allowing each side to operate independently of the other while
maintaining an appropriate and efficient flow of data. For example,
[`net.Socket`][] instances are [`Duplex`][] streams whose `Readable` side allows
@@ -388,7 +388,7 @@ changes:
not operating in object mode, `chunk` must be a string, `Buffer` or
`Uint8Array`. For object mode streams, `chunk` may be any JavaScript value
other than `null`.
-* `encoding` {string} The encoding, if `chunk` is a string
+* `encoding` {string} The encoding if `chunk` is a string
* `callback` {Function} Optional callback for when the stream is finished
* Returns: {this}
@@ -531,7 +531,7 @@ not draining may lead to a remotely exploitable vulnerability.
Writing data while the stream is not draining is particularly
problematic for a [`Transform`][], because the `Transform` streams are paused
-by default until they are piped or an `'data'` or `'readable'` event handler
+by default until they are piped or a `'data'` or `'readable'` event handler
is added.
If the data to be written can be generated or fetched on demand, it is
@@ -610,7 +610,7 @@ until a mechanism for either consuming or ignoring that data is provided. If
the consuming mechanism is disabled or taken away, the `Readable` will *attempt*
to stop generating the data.
-For backwards compatibility reasons, removing [`'data'`][] event handlers will
+For backward compatibility reasons, removing [`'data'`][] event handlers will
**not** automatically pause the stream. Also, if there are piped destinations,
then calling [`stream.pause()`][stream-pause] will not guarantee that the
stream will *remain* paused once those destinations drain and ask for more data.
@@ -1342,7 +1342,7 @@ Especially useful in error handling scenarios where a stream is destroyed
prematurely (like an aborted HTTP request), and will not emit `'end'`
or `'finish'`.
-The `finished` API is promisify'able as well;
+The `finished` API is promisify-able as well;
```js
const finished = util.promisify(stream.finished);
@@ -1394,7 +1394,7 @@ pipeline(
);
```
-The `pipeline` API is promisify'able as well:
+The `pipeline` API is promisify-able as well:
```js
const pipeline = util.promisify(stream.pipeline);
@@ -1866,7 +1866,7 @@ changes:
any JavaScript value.
* `encoding` {string} Encoding of string chunks. Must be a valid
`Buffer` encoding, such as `'utf8'` or `'ascii'`.
-* Returns: {boolean} `true` if additional chunks of data may continued to be
+* Returns: {boolean} `true` if additional chunks of data may continue to be
pushed; `false` otherwise.
When `chunk` is a `Buffer`, `Uint8Array` or `string`, the `chunk` of data will
@@ -2279,7 +2279,7 @@ The `callback` function must be called only when the current chunk is completely
consumed. The first argument passed to the `callback` must be an `Error` object
if an error occurred while processing the input or `null` otherwise. If a second
argument is passed to the `callback`, it will be forwarded on to the
-`readable.push()` method. In other words the following are equivalent:
+`readable.push()` method. In other words, the following are equivalent:
```js
transform.prototype._transform = function(data, encoding, callback) {
@@ -2326,7 +2326,7 @@ less powerful and less useful.
guaranteed. This meant that it was still necessary to be prepared to receive
[`'data'`][] events *even when the stream was in a paused state*.
-In Node.js 0.10, the [`Readable`][] class was added. For backwards
+In Node.js 0.10, the [`Readable`][] class was added. For backward
compatibility with older Node.js programs, `Readable` streams switch into
"flowing mode" when a [`'data'`][] event handler is added, or when the
[`stream.resume()`][stream-resume] method is called. The effect is that, even