In Node.js, you can use the built-in Buffer class for Base64 encoding. The Buffer class is a global object in Node.js designed for handling binary data.
Here is an example of encoding a string to Base64:
```javascript
// Assuming a string that needs to be Base64 encoded
let str = "Hello, World!";

// Create a Buffer instance to convert the string into binary data
let buffer = Buffer.from(str);

// Use the toString method to encode the data as Base64
let base64Encoded = buffer.toString('base64');

console.log(base64Encoded); // Output the Base64-encoded string
```
In this example, we first create a Buffer instance with Buffer.from(), which converts the string into bytes (UTF-8 by default). We then call toString() with 'base64' as the argument, which returns the Base64-encoded string.
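The conversion also works in reverse: passing 'base64' as the second argument to Buffer.from() parses Base64 text back into bytes. A minimal round trip, using the same string as above:

```javascript
// Encode the string to Base64, as in the previous example
const original = 'Hello, World!';
const base64Encoded = Buffer.from(original).toString('base64');
console.log(base64Encoded); // SGVsbG8sIFdvcmxkIQ==

// Buffer.from(..., 'base64') parses the Base64 text back into bytes;
// toString('utf8') then recovers the original string
const decoded = Buffer.from(base64Encoded, 'base64').toString('utf8');
console.log(decoded); // Hello, World!
```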
If you want to encode file content as Base64, you can first read the file content into a Buffer using Node.js's file system module (fs), then perform a similar conversion. Here is an example of converting file content to Base64:
```javascript
const fs = require('fs');

// Asynchronously read the file content and perform Base64 encoding
fs.readFile('path/to/file', (err, data) => {
  if (err) {
    console.error('Error reading the file.', err);
    return;
  }
  // data is a Buffer instance containing the file's binary content
  let base64Encoded = data.toString('base64');
  console.log(base64Encoded); // Output the Base64-encoded file content
});

// If you need to read the file synchronously, use readFileSync
try {
  let data = fs.readFileSync('path/to/file');
  let base64Encoded = data.toString('base64');
  console.log(base64Encoded); // Output the Base64-encoded file content
} catch (err) {
  console.error('Error reading the file.', err);
}
```
When handling large files, be mindful of memory usage: readFile and readFileSync load the entire file into a single Buffer. For very large files, read and encode the data in chunks instead, so only one chunk is held in memory at a time.