Q1. How do you read a file asynchronously in Node.js?
Use fs.readFile() with a callback. Example:
const fs = require('fs');

fs.readFile('example.txt', 'utf8', (err, data) => {
  if (err) {
    console.error(err);
    return;
  }
  console.log(data);
});
The 'utf8' encoding returns the file content as a string; without an encoding, the callback receives a Buffer. Always check the err argument before using the data.
Q2. How do you read a file synchronously?
Use fs.readFileSync(). Example:
const fs = require('fs');

try {
  const data = fs.readFileSync('example.txt', 'utf8');
  console.log(data);
} catch (err) {
  console.error(err);
}
Synchronous methods block the event loop, so they are not recommended for production servers handling many concurrent requests. They are fine for initialization scripts or CLI tools, where nothing else is waiting on the event loop.
Q3. How do you read a file using promises?
Use fs.promises.readFile(). Example:
const fs = require('fs').promises;

async function readFile() {
  try {
    const data = await fs.readFile('example.txt', 'utf8');
    console.log(data);
  } catch (err) {
    console.error(err);
  }
}
This combines non-blocking I/O with the cleaner control flow of async/await.
Q4. How do you read large files efficiently?
For large files, use streams: fs.createReadStream(). This reads the file in chunks without loading the entire file into memory. Example:
const fs = require('fs');

const readStream = fs.createReadStream('largefile.txt', 'utf8');
readStream.on('data', (chunk) => {
  processChunk(chunk); // processChunk is a placeholder for your own handler
});
readStream.on('end', () => {
  console.log('Done');
});
readStream.on('error', (err) => {
  console.error(err); // streams report failures via the 'error' event, not a callback
});
Q5. How do you handle different encodings when reading files?
You can specify encoding as the second argument: 'utf8', 'ascii', 'base64', 'hex', etc. If you need to handle binary data, omit encoding to get a buffer. You can also specify encoding in the options object:
fs.readFile('file.txt', { encoding: 'utf8', flag: 'r' }, callback);
