JavaScript Basics
How do you calculate the average of an array in JavaScript?
Method 1: reduce (most common)
function average(arr) {
  if (arr.length === 0) return 0;
  const sum = arr.reduce((acc, val) => acc + val, 0);
  return sum / arr.length;
}
average([1, 2, 3, 4, 5]); // 3
Method 2: for loop
function average(arr) {
  if (arr.length === 0) return 0;
  let sum = 0;
  for (const num of arr) {
    sum += num;
  }
  return sum / arr.length;
}
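The loop version behaves exactly like the reduce version. A quick sanity check (the function is repeated so this snippet runs on its own):

```javascript
// Same for-loop implementation as above, repeated so the snippet is standalone.
function average(arr) {
  if (arr.length === 0) return 0;
  let sum = 0;
  for (const num of arr) {
    sum += num;
  }
  return sum / arr.length;
}

console.log(average([1, 2, 3, 4, 5])); // 3
console.log(average([]));              // 0 (empty array is guarded)
```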
One-liner
const avg = arr => arr.reduce((a, b) => a + b, 0) / arr.length;
avg([10, 20, 30]); // 20
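Note that the one-liner has no empty-array guard: avg([]) evaluates 0 / 0, which is NaN. A minimal demonstration:

```javascript
const avg = arr => arr.reduce((a, b) => a + b, 0) / arr.length;

console.log(avg([10, 20, 30])); // 20
console.log(avg([]));           // NaN (0 / 0)
```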
Edge Cases to Handle
- Empty arrays must be handled to avoid dividing by 0 (which produces NaN)
- Non-numeric values in the array may cause NaN results
function safeAverage(arr) {
  const nums = arr.filter(n => typeof n === 'number' && !isNaN(n));
  if (nums.length === 0) return 0;
  return nums.reduce((a, b) => a + b, 0) / nums.length;
}
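For example, safeAverage skips strings and NaN entries before averaging (the mixed arrays below are made-up illustrations; the function is repeated so the snippet runs standalone):

```javascript
// Same safeAverage as above, repeated so this snippet is self-contained.
function safeAverage(arr) {
  const nums = arr.filter(n => typeof n === 'number' && !isNaN(n));
  if (nums.length === 0) return 0;
  return nums.reduce((a, b) => a + b, 0) / nums.length;
}

console.log(safeAverage([1, '2', 3, NaN])); // 2 -- only 1 and 3 are counted
console.log(safeAverage(['a', 'b']));       // 0 -- no numeric values left
```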