Why JSON Best Practices Matter
JSON is everywhere: REST APIs, configuration files, database documents, message queues, and more. When teams work with JSON inconsistently, small problems compound — mismatched key names break consumers, unexpected null values crash parsers, and deeply nested structures become impossible to maintain. Following clear conventions from the start saves significant time and frustration.
1. Use Consistent Key Naming Conventions
Pick one naming convention and apply it everywhere. The two most common are camelCase (used by most JavaScript/TypeScript APIs) and snake_case (popular in Python and Ruby APIs). Avoid mixing them — a single API should not have both firstName and last_name.
Many teams use camelCase for JSON because JavaScript uses it natively, reducing the need for transformation layers.
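When you cannot control an upstream API's naming, normalizing keys at the boundary keeps the rest of your codebase consistent. Here is a minimal sketch of such a transformation layer; the helper name and recursive shape are illustrative, not from any particular library:

```javascript
// Sketch: recursively convert snake_case keys from an external
// API into camelCase before the data enters application code.
function snakeToCamel(value) {
  if (Array.isArray(value)) return value.map(snakeToCamel);
  if (value === null || typeof value !== "object") return value;
  return Object.fromEntries(
    Object.entries(value).map(([key, inner]) => [
      key.replace(/_([a-z])/g, (_, c) => c.toUpperCase()),
      snakeToCamel(inner),
    ])
  );
}

snakeToCamel({ first_name: "Alice", last_name: "Smith" });
// → { firstName: "Alice", lastName: "Smith" }
```

Running this once at the HTTP client layer means no other module ever sees mixed conventions.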
2. Be Explicit About Null vs. Missing Fields
There is a meaningful difference between a field that is explicitly null and a field that is absent from the object. In general:
- Absent field — the property does not apply or was never set.
- null — the property applies but has no value.
Decide on a convention for your API and document it. Do not send null fields unless the consumer needs to distinguish between "not set" and "explicitly empty."
3. Avoid Deep Nesting
JSON structures that are 5 or more levels deep become hard to read, debug, and query. Flatten your data when possible, and consider breaking deeply nested objects into separate resources (with IDs referencing each other) if you are designing an API.
// Prefer flat structures
{
  "userId": "123",
  "userName": "Alice",
  "addressCity": "New York"
}
// Over deeply nested ones
{
  "user": {
    "id": "123",
    "profile": {
      "address": {
        "city": "New York"
      }
    }
  }
}
4. Use Arrays for Lists, Objects for Named Properties
It is tempting to use objects with numeric keys to represent ordered lists, but arrays are the right tool for ordered sequences. Use objects only when the keys are meaningful names.
// Good: array for a list
{ "tags": ["javascript", "api", "json"] }
// Bad: object with numeric keys
{ "tags": { "0": "javascript", "1": "api" } }
5. Always Validate JSON Before Parsing
Never assume incoming JSON is valid. In production code, wrap your parsing calls in try/catch blocks (or use a schema validation library like Zod or Ajv) to handle malformed input gracefully. Never let a JSON parse error crash your application.
try {
  const data = JSON.parse(rawInput);
  // process data
} catch (e) {
  console.error("Invalid JSON:", e.message);
}
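Catching the parse error is only half the job: the parsed value may still have the wrong shape. Below is a minimal hand-rolled shape check as a sketch (a schema library such as Zod or Ajv does this more robustly); the expected fields `id` and `name` are hypothetical, chosen only for illustration:

```javascript
// Sketch: parse, then verify the expected shape before use.
// The fields checked here (id, name) are illustrative only.
function parseUser(rawInput) {
  let data;
  try {
    data = JSON.parse(rawInput);
  } catch (e) {
    return { ok: false, error: "Invalid JSON: " + e.message };
  }
  if (typeof data !== "object" || data === null ||
      typeof data.id !== "string" || typeof data.name !== "string") {
    return { ok: false, error: "Unexpected shape" };
  }
  return { ok: true, user: data };
}

parseUser('{"id": "123", "name": "Alice"}'); // { ok: true, user: ... }
parseUser('not json at all');                // { ok: false, error: ... }
```

Returning a result object instead of throwing keeps malformed input as an ordinary value the caller must handle, rather than a crash path.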
6. Use ISO 8601 for Dates
JSON has no native date type. Always serialize dates as ISO 8601 strings (e.g., "2026-03-08T14:30:00Z"). This format is unambiguous, sortable as a string, and universally parseable across languages.
7. Keep Payloads Small
Only include fields that the consumer actually needs. Large JSON payloads increase latency and bandwidth costs. Consider implementing field filtering (like GraphQL or sparse fieldsets in JSON:API) for endpoints that serve many different consumers with different data needs.
8. Version Your APIs
As your JSON schemas evolve, breaking changes (removing fields, renaming keys, changing types) will break consumers. Versioning your API (e.g., /api/v1/) gives consumers time to migrate before deprecated formats are removed.
Conclusion
Good JSON practices are not about following arbitrary rules — they are about reducing confusion for the people and systems that consume your data. Consistent key naming, thoughtful null handling, shallow nesting, and proper validation all contribute to APIs and configs that are a pleasure to work with. Start with these practices on your next project and you will thank yourself later.