How you read multiple JSON values from a file or stream in Python depends on how the data is stored. There are two common layouts for multiple JSON objects in a file:
1. JSON Array
If the JSON objects in the file are stored as an array, for example:
```json
[
  {"name": "Alice", "age": 25},
  {"name": "Bob", "age": 30},
  {"name": "Cathy", "age": 22}
]
```
In this case, you can directly use Python's json module to load the entire array. Here is the corresponding code example:
```python
import json

# Open and read the JSON file
with open('data.json', 'r') as file:
    data = json.load(file)

# data is now a list of dictionaries
for item in data:
    print(f"Name: {item['name']}, Age: {item['age']}")
```
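The round trip works in both directions: `json.dump` writes a Python list back out as a single JSON array that `json.load` can read again. A minimal sketch (the file path and sample data are made up for illustration):

```python
import json
import os
import tempfile

# Hypothetical sample data matching the array example above
people = [
    {"name": "Alice", "age": 25},
    {"name": "Bob", "age": 30},
]

# Write the list as one JSON array, then read it back
path = os.path.join(tempfile.mkdtemp(), "data.json")
with open(path, "w") as f:
    json.dump(people, f, indent=2)  # indent is optional, for readability

with open(path, "r") as f:
    loaded = json.load(f)

print(loaded == people)  # the round trip preserves the data
```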
2. Multiple JSON Objects
If the file contains multiple independent JSON objects, each a complete JSON value on its own line rather than an element of an enclosing array, for example:
```json
{"name": "Alice", "age": 25}
{"name": "Bob", "age": 30}
{"name": "Cathy", "age": 22}
```
This format is known as JSON Lines or newline-delimited JSON. For this case, you need to read the file line by line and parse each line individually:
```python
import json

# Open the file
with open('data.json', 'r') as file:
    # Read and parse line by line
    for line in file:
        item = json.loads(line)
        print(f"Name: {item['name']}, Age: {item['age']}")
```
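Real-world JSON Lines files sometimes contain blank lines or the occasional malformed record, and a bare `json.loads(line)` raises `JSONDecodeError` on either. A slightly more defensive variant (the sample content is made up, and `io.StringIO` stands in for an open file):

```python
import io
import json

# Hypothetical JSON Lines content with a blank line and a bad record
raw = '{"name": "Alice", "age": 25}\n\n{"name": "Bob", "age": 30}\nnot json\n'

records = []
for lineno, line in enumerate(io.StringIO(raw), start=1):
    line = line.strip()
    if not line:          # skip blank lines
        continue
    try:
        records.append(json.loads(line))
    except json.JSONDecodeError as exc:
        # Report and skip instead of aborting the whole read
        print(f"Skipping malformed line {lineno}: {exc}")

print(len(records))  # → 2
```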
Advanced Scenario: Large Files or Stream Processing
If you need to read JSON data from very large files or real-time streams, consider a streaming parser such as the third-party ijson library, which parses iteratively instead of loading all the data into memory at once.
```python
import ijson

# Open a large file (binary mode is recommended for ijson)
with open('very_large_data.json', 'rb') as file:
    # 'item' is ijson's prefix for the elements of a top-level array
    objects = ijson.items(file, 'item')
    for obj in objects:
        print(f"Name: {obj['name']}, Age: {obj['age']}")
```
With this approach you can process large-scale JSON data with a small, roughly constant memory footprint, since only one item is held in memory at a time.
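One more case worth knowing: JSON values that are simply concatenated, with arbitrary whitespace rather than one value per line. The standard library handles this via `json.JSONDecoder.raw_decode`, which parses one value and returns the index where it ended. A sketch with made-up input:

```python
import json

# Hypothetical input: objects concatenated with arbitrary whitespace,
# not necessarily one per line
raw = '{"name": "Alice", "age": 25} {"name": "Bob",\n"age": 30}'

decoder = json.JSONDecoder()
objects = []
pos = 0
while pos < len(raw):
    # Skip whitespace between values
    while pos < len(raw) and raw[pos].isspace():
        pos += 1
    if pos >= len(raw):
        break
    # raw_decode returns the parsed value and the index just past it
    obj, pos = decoder.raw_decode(raw, pos)
    objects.append(obj)

print([o["name"] for o in objects])  # → ['Alice', 'Bob']
```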