stream-schema

Streaming JSON Parser with Schema Validation

Parse LLM outputs token by token. Get real-time partial results with incremental validation.

Incremental Parsing

Get partial results instantly as JSON streams in
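To illustrate the general idea behind incremental parsing (this is a minimal sketch, not stream-schema's internals, and `parsePartial` is a hypothetical helper, not part of the library's API): track unclosed strings, braces, and brackets in the buffered text, append the missing closers, and attempt an ordinary JSON.parse on the repaired copy.

```javascript
// Best-effort parse of an incomplete JSON string.
// Sketch only: a real streaming parser handles more cut points
// (e.g. a key with no value yet) and avoids re-parsing on every chunk.
function parsePartial(text) {
  const closers = [];
  let inString = false;
  let escaped = false;
  for (const ch of text) {
    if (inString) {
      if (escaped) escaped = false;
      else if (ch === '\\') escaped = true;
      else if (ch === '"') inString = false;
    } else if (ch === '"') inString = true;
    else if (ch === '{') closers.push('}');
    else if (ch === '[') closers.push(']');
    else if (ch === '}' || ch === ']') closers.pop();
  }
  let repaired = text;
  if (inString) repaired += '"';            // close a string cut mid-value
  repaired = repaired.replace(/,\s*$/, ''); // drop a dangling comma
  repaired += closers.reverse().join('');   // close open arrays/objects
  try {
    return JSON.parse(repaired);
  } catch {
    return null; // cut at a point this simple sketch cannot repair
  }
}

// '{"name": "Ada", "interests": ["ma'  →  { name: 'Ada', interests: ['ma'] }
```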

Schema Validation

Validate against JSON Schema as data arrives

LLM Error Recovery

Automatically repair common LLM output mistakes such as trailing commas and unquoted keys
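The two repairs named above can be sketched with regexes (a hypothetical `repairJson` helper for illustration, not how the library implements recovery; naive regexes can corrupt string values that happen to contain these patterns, so a robust implementation works token-aware):

```javascript
// Fix two common LLM JSON mistakes: trailing commas and unquoted keys.
function repairJson(text) {
  return text
    .replace(/,\s*([}\]])/g, '$1')                             // trailing commas
    .replace(/([{,]\s*)([A-Za-z_$][\w$]*)\s*:/g, '$1"$2":');   // unquoted keys
}

// '{name: "Ada", tags: ["x",],}'  →  '{"name": "Ada", "tags": ["x"]}'
```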

Streaming Input
A typical user object from an LLM
Parsed Output
Real-time partial results from stream-schema
Schema Definition
The JSON Schema used for validation
{
  "type": "object",
  "properties": {
    "name": {
      "type": "string"
    },
    "age": {
      "type": "number"
    },
    "email": {
      "type": "string",
      "format": "email"
    },
    "interests": {
      "type": "array",
      "items": {
        "type": "string"
      }
    }
  },
  "required": [
    "name",
    "email"
  ]
}
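One way to approximate validation as data arrives (a sketch against the schema above; `checkField` and `checkRequired` are hypothetical helpers, not the library's API) is to type-check each field as soon as it completes, and defer presence checks for `required` fields until the stream ends, since a missing field may simply not have arrived yet.

```javascript
// Minimal subset of the schema shown above
const schema = {
  type: 'object',
  properties: {
    name: { type: 'string' },
    age: { type: 'number' },
    email: { type: 'string', format: 'email' },
    interests: { type: 'array', items: { type: 'string' } },
  },
  required: ['name', 'email'],
};

// Validate one completed field; returns null when valid,
// otherwise a human-readable error string.
function checkField(schema, field, value) {
  const prop = schema.properties[field];
  if (!prop) return `unknown field "${field}"`;
  const actual = Array.isArray(value) ? 'array' : typeof value;
  if (actual !== prop.type) return `${field}: expected ${prop.type}, got ${actual}`;
  return null;
}

// Once the stream completes, report any missing required fields.
function checkRequired(schema, data) {
  return schema.required.filter((key) => !(key in data));
}
```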
Usage Example
How to use stream-schema in your code
import { createStreamParser } from 'stream-schema';

const schema = {
  "type": "object",
  "properties": {
    "name": {
      "type": "string"
    },
    "age": {
      "type": "number"
    },
    "email": {
      "type": "string",
      "format": "email"
    },
    "interests": {
      "type": "array",
      "items": {
        "type": "string"
      }
    }
  },
  "required": [
    "name",
    "email"
  ]
};

const parser = createStreamParser(schema, {
  events: {
    onPartialObject: (data) => {
      // Update UI with partial data
      renderPartialUI(data);
    },
    onCompleteField: (field, value) => {
      console.log(`✓ ${field} completed:`, value);
    },
    onValidationError: (error) => {
      console.warn('Validation:', error);
    }
  }
});

// Feed chunks as they arrive from the LLM
// (wrapped in a function: `return` is invalid at a module's top level)
async function readUser(llmStream) {
  for await (const chunk of llmStream) {
    const result = parser.feed(chunk);

    if (result.complete) {
      return result.data; // Fully typed!
    }
  }
}