Use Reduce to Filter and Map over Large Datasets

Learn how two common array functions - map() and filter() - are syntactic sugar for reduce operations. Learn how to use them, how to compose them, and how using a single reduce can give you a big performance boost over composing filter() and map() calls over a large data set.
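For reference, here is a minimal sketch of the idea (the data and variable names are my own illustration, not the lesson's exact code): filter() and map() each rewritten as a reduce(), followed by a single fused reduce() that does both in one pass over the array.

const data = Array.from({ length: 1_000_000 }, (_, i) => i);

// filter() expressed as reduce(): keep only even numbers
const evens = data.reduce((acc, n) => {
  if (n % 2 === 0) acc.push(n);
  return acc;
}, []);

// map() expressed as reduce(): double each value
const doubled = evens.reduce((acc, n) => {
  acc.push(n * 2);
  return acc;
}, []);

// Fused version: one pass and one result array,
// instead of an intermediate array per step
const doubledEvens = data.reduce((acc, n) => {
  if (n % 2 === 0) acc.push(n * 2);
  return acc;
}, []);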

Jack
~ 8 years ago

Awesome and very helpful. Thanks a lot! reduce is the best array function: confirmed!

Jiwon
~ 8 years ago

Thanks for the practical example!

Mike
~ 7 years ago

Nicely done. I did a whole 40 minute talk on this very subject, but you have nicely reduced it down to its core in 8 minutes. Looking forward to the rest of the series.

h2t2
~ 5 years ago

Remarkable: I got much, much worse numbers for the composition of the functions. Single measurements are not reliable (they change on every run, so you need to take an average over many runs). However, I never got a factor below 4, and I even reached an incredible factor of 10.

That was amazing. And it is not due to my arrow functions. They may be slower, but I would need a long time series to confirm that.
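(For anyone who wants to reproduce this, a rough sketch of the averaging approach described above; my own code, not anything from the video. It times each strategy over many runs and reports the mean.)

const data = Array.from({ length: 1_000_000 }, (_, i) => i);

// Run fn many times and return the mean duration in milliseconds
function time(fn, runs = 20) {
  let total = 0;
  for (let i = 0; i < runs; i++) {
    const start = performance.now();
    fn();
    total += performance.now() - start;
  }
  return total / runs;
}

const composed = () => data.filter(n => n % 2 === 0).map(n => n * 2);
const fused = () =>
  data.reduce((acc, n) => {
    if (n % 2 === 0) acc.push(n * 2);
    return acc;
  }, []);

console.log('composed:', time(composed).toFixed(2), 'ms');
console.log('fused:', time(fused).toFixed(2), 'ms');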

~ 3 years ago

All my algorithm interview prep slogging has trained me to believe that the .map().filter() strategy and the reduce() strategy both flatten to O(n) time complexity, and the small constant-factor difference isn't worth thinking about that much. We should be looking for bigger performance fish to fry, or if we're really concerned about the scale of this data set, maybe not do the work in single-threaded JavaScript on the client. Am I wrong?