Commit c2d79a6
Author: mochatek (committed)
docs: Update docs
1 parent a2f4446 commit c2d79a6

File tree

2 files changed: +46 −32 lines

README.md

Lines changed: 45 additions & 31 deletions
@@ -5,11 +5,11 @@
 [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
 ![Coverage](https://img.shields.io/badge/coverage-96.88%25-brightgreen)
 
-Specialized Node.js library for memory-efficient operations on JSON array files. Stream individual elements from large JSON array files and append new elements without loading the entire array into memory. Perfect for processing large-scale JSON array datasets without memory limitations.
+Specialized Node.js library for memory-efficient operations on JSON arrays. Stream individual elements from large JSON arrays (files, network responses, etc.) and append elements to array files without loading the entire array into memory. Perfect for processing large-scale JSON array datasets without memory limitations.
 
 ## Why Use This?
 
-- 🎯 **Specialized**: Purpose-built for JSON array files
+- 🎯 **Specialized**: Purpose-built for JSON arrays
 - 💾 **Memory Efficient**: Process arrays of any size without loading them entirely
 - ⚡ **High Performance**: Optimized streaming and batch operations
 - ✍️ **Direct Updates**: Append elements without rewriting the entire file
@@ -21,76 +21,90 @@ Specialized Node.js library for memory-efficient operations on JSON array files.
 npm install jsonarrayfs
 ```
 
-## Usage
+## Examples
+
+### 1. Stream from File
+
+Process a large JSON array file (e.g., application logs) without loading it into memory:
 
 ```typescript
-import { JsonArrayStream, appendToJsonArrayFile } from "jsonarrayfs";
+import { JsonArrayStream } from "jsonarrayfs";
 import { createReadStream } from "node:fs";
-import { Transform } from "node:stream";
 
-// Process a large application log file (10GB+ JSON array)
+// Analyze logs: Count errors and slow responses
 const fileStream = createReadStream("app.log.json");
 const arrayStream = new JsonArrayStream("utf8");
 
-// Analyze logs: Count errors and slow responses
 let errorCount = 0;
 let slowResponses = 0;
 
 for await (const log of fileStream.pipe(arrayStream)) {
   if (log !== JsonArrayStream.NULL) {
-    if (log.level === "error") {
-      errorCount++;
-      console.error(`Error in ${log.service}: ${log.message}`);
-    }
-
-    if (log.responseTime > 1000) {
-      slowResponses++;
-      console.warn(`Slow response: ${log.path} (${log.responseTime}ms)`);
-    }
+    if (log.level === "error") errorCount++;
+    if (log.responseTime > 1000) slowResponses++;
   }
 }
 
 console.log(
-  `Analysis complete: ${errorCount} errors, ${slowResponses} slow responses`,
+  `Analysis complete: Found ${errorCount} errors, ${slowResponses} slow responses`,
 );
+```
+
+### 2. Stream from Network
+
+Process a JSON array from an API response:
+
+```typescript
+import { JsonArrayStream } from "jsonarrayfs";
+import { get } from "node:https";
+
+get("https://api.example.com/json-array-data", (res) => {
+  const arrayStream = new JsonArrayStream("utf8");
+
+  res.pipe(arrayStream).on("data", (item) => {
+    console.log(`Got item: ${item === JsonArrayStream.NULL ? null : item}`);
+  });
+});
+```
+
+### 3. Append to File
+
+Append new elements to an existing JSON array file:
+
+```typescript
+import { appendToJsonArrayFile } from "jsonarrayfs";
 
 // Append new log entries
 const newLogs = [
   {
     timestamp: Date.now(),
     level: "info",
-    service: "auth",
-    path: "/api/login",
-    responseTime: 245,
     message: "User login successful",
+    responseTime: 245,
   },
   {
     timestamp: Date.now(),
     level: "info",
-    service: "auth",
-    path: "/api/login",
-    responseTime: 1245,
     message: "User login successful",
+    responseTime: 1245,
   },
   null,
   {
     timestamp: Date.now(),
     level: "error",
-    service: "payment",
-    path: "/api/checkout",
-    responseTime: 1532,
     message: "Database connection timeout",
+    responseTime: 1532,
   },
 ];
 
 await appendToJsonArrayFile("app.log.json", "utf8", ...newLogs);
 ```
 
-## API
+## API Reference
 
-### JsonArrayStream
+### `JsonArrayStream`
 
-A transform stream that reads JSON array files and emits elements one by one for efficient processing. When processing arrays containing `null` values, it uses a special sentinel value (`JsonArrayStream.NULL`) to distinguish between JSON `null` and stream EOF.
+A transform stream that parses JSON array elements one by one for efficient processing. When processing arrays containing `null` values, it uses a special sentinel value (`JsonArrayStream.NULL`) to distinguish between JSON `null` and stream EOF.
 
 #### Constructor

@@ -100,7 +114,7 @@ new JsonArrayStream(encoding?: string)
 
 ##### Parameters
 
-- `encoding` (string, optional): File encoding (default: 'utf8')
+- `encoding` (string, optional): Content encoding (default: 'utf8')
 
 #### Properties

@@ -111,7 +125,7 @@ new JsonArrayStream(encoding?: string)
 - `data`: Emitted for each array element
 - `error`: Emitted when parsing fails or input is invalid
 
-### appendToJsonArrayFile
+### `appendToJsonArrayFile`
 
 Appends elements to a JSON array file efficiently without loading the entire file into memory.

package.json

Lines changed: 1 addition & 1 deletion

@@ -1,7 +1,7 @@
 {
   "name": "jsonarrayfs",
   "version": "1.2.1",
-  "description": "Specialized Node.js library for memory-efficient operations on JSON array files. Stream individual elements from large JSON array files and append new elements without loading the entire array into memory. Perfect for processing large-scale JSON array datasets without memory limitations.",
+  "description": "Specialized Node.js library for memory-efficient operations on JSON arrays. Stream individual elements from large JSON arrays (files, network responses, etc.) and append elements to array files without loading the entire array into memory. Perfect for processing large-scale JSON array datasets without memory limitations.",
   "main": "./dist/index.js",
   "module": "./dist/index.mjs",
   "types": "./dist/index.d.ts",
