# Benchmarking Work Group
## Mandate
The Benchmarking working group's purpose is to gain consensus on an agreed set of benchmarks that can be used to:
- Track and evangelize performance gains made between Node releases
- Avoid performance regressions between releases
Its responsibilities are:
- Identify one or more benchmarks that reflect typical customer usage. More than one is likely needed to cover common Node.js use cases, including low latency and high concurrency
- Work to get community consensus on the list chosen
- Add regular execution of chosen benchmarks to Node builds
- Track/publicize performance between builds/releases
The path forward is to:
- Define the important use cases
- Define the key runtime attributes
- Find/create benchmarks that provide good coverage for the use cases and attributes
(current table)
See https://github.com/nodejs/benchmarking/blob/master/benchmarks/README.md for information about the infrastructure in place so far.
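To illustrate the kind of measurement involved, the sketch below is a minimal micro-benchmark harness in plain Node.js. It is purely illustrative and is not part of the working group's actual benchmark suite; the `bench` helper and the JSON workload are assumptions chosen for the example. It reports throughput (ops/sec) and mean per-operation latency, the two runtime attributes called out above (concurrency and latency).

```javascript
// Minimal micro-benchmark harness (illustrative sketch only; not the WG's
// actual suite). Runs a workload for a fixed iteration count and reports
// throughput and mean latency per operation.
function bench(name, fn, iterations = 1e5) {
  // Warm up so V8 can optimize the hot path before we measure.
  for (let i = 0; i < 1000; i++) fn();

  const start = process.hrtime.bigint();
  for (let i = 0; i < iterations; i++) fn();
  const elapsedNs = Number(process.hrtime.bigint() - start);

  const opsPerSec = iterations / (elapsedNs / 1e9);
  const meanLatencyUs = elapsedNs / iterations / 1000;
  console.log(`${name}: ${opsPerSec.toFixed(0)} ops/sec, ` +
              `${meanLatencyUs.toFixed(3)} us/op`);
  return { opsPerSec, meanLatencyUs };
}

// Example workload: a JSON round-trip, a common pattern in Node.js services.
const payload = { id: 42, tags: ['a', 'b'], nested: { ok: true } };
bench('json-round-trip', () => JSON.parse(JSON.stringify(payload)));
```

Real benchmark runs would repeat this across releases on fixed hardware so that regressions between builds show up as statistically significant changes in these numbers.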
## Logistics

### Semi-monthly Meetings
Meetings of the working group typically occur every third Tuesday, as shown on the Node.js project calendar.
A few days before each meeting, an issue will be created with the date and time of the meeting. The issue will provide schedule logistics as well as an agenda, links to meeting minutes, and information about how to join as a participant or a viewer.
## Current Project Team Members
- Michael Dawson (@mhdawson), Facilitator
- Uttam Pawar (@uttampawar)
- Michael Paulson (@michaelbpaulson)
- Gareth Ellis (@gareth-ellis)
- Kunal Pathak (@kunalspathak)
- Jamie Davis (@davisjam)
## Emeritus Project Team Members
- Trevor Norris (@trevnorris)
- Ali Sheikh (@ofrobots)
- Yosuke Furukawa (@yosuke-furukawa)
- Yunong Xiao (@yunong)
- Mark Leitch (@m-leitch)
- Surya V Duggirala (@suryadu)
- Wayne Andrews (@CurryKitten)
- Kyle Farnung (@kfarnung)
- Benedikt Meurer (@bmeurer)
- Sathvik Laxminarayan (@sathvikl)