🐎 Benchmark standard

Not all benchmarks are equal, but we can at least make them replicable.

We regularly compare the performance of our endpoints against other industry providers. This benchmarking helps us establish a performance baseline and verify that our endpoints consistently match or surpass the competition.

Testing Setup

  • Server Configuration: A dedicated CPU-Optimized DigitalOcean Droplet with 16 vCPUs and 32 GB of memory.

  • Server Location: Frankfurt (FRA1).

  • Benchmarking Tool: For consistency and precision, we use Flood for every benchmark run; the sketch after this list illustrates the kind of measurement it performs.
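
What follows is not how Flood is invoked; it is only a minimal sketch of the underlying measurement: timing individual JSON-RPC calls against an endpoint and recording whether each one succeeds. The endpoint URL and the choice of method are placeholders, not part of our actual test suite.

```python
import json
import time
from urllib.request import Request, urlopen

# Placeholder endpoint; substitute the provider URL under test.
ENDPOINT = "https://rpc.example.com/"


def timed_rpc_call(method: str, params: list) -> tuple[float, bool]:
    """Send one JSON-RPC request and return (latency in seconds, success)."""
    payload = json.dumps(
        {"jsonrpc": "2.0", "id": 1, "method": method, "params": params}
    ).encode()
    request = Request(
        ENDPOINT, data=payload, headers={"Content-Type": "application/json"}
    )
    start = time.perf_counter()
    try:
        with urlopen(request, timeout=10) as response:
            success = "result" in json.loads(response.read())
    except Exception:
        success = False
    return time.perf_counter() - start, success


# Collect a batch of samples for a single method.
samples = [timed_rpc_call("eth_blockNumber", []) for _ in range(100)]
```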

What's in Our Benchmark Report?

  • Details on providers and the exact flood commands executed.

  • Percentile latency metrics, including P50, P90, and P99 (see the metrics sketch after this list).

  • Throughput rates.

  • Success rate metrics.
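
The percentile, throughput, and success-rate figures in the report come from Flood's own output; the sketch below only makes their definitions concrete by reducing a batch of (latency, success) samples, such as the ones collected above, to the same summary metrics. The function names and sample format are illustrative.

```python
def percentile(values: list[float], pct: float) -> float:
    """Nearest-rank percentile of a list of latency samples."""
    if not values:
        return float("nan")
    ordered = sorted(values)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]


def summarize(samples: list[tuple[float, bool]], duration_s: float) -> dict:
    """Reduce (latency, success) samples to the metrics shown in the report."""
    latencies = [latency for latency, ok in samples if ok]
    return {
        "p50_ms": percentile(latencies, 50) * 1000,
        "p90_ms": percentile(latencies, 90) * 1000,
        "p99_ms": percentile(latencies, 99) * 1000,
        "throughput_rps": len(samples) / duration_s,
        "success_rate": sum(ok for _, ok in samples) / len(samples),
    }
```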

Note: We also monitor the benchmark server throughout each test to confirm there are no CPU or I/O bottlenecks, so the results reflect the RPC endpoints' performance rather than limits of the load generator.
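
One way to verify this on the load-generating droplet is to sample CPU utilization and disk I/O alongside the test. The sketch below uses the third-party psutil package and is an illustrative assumption, not our exact monitoring setup.

```python
import psutil  # third-party: pip install psutil


def sample_host_load(interval_s: float = 1.0, samples: int = 60) -> list[dict]:
    """Record CPU utilization and disk throughput while a benchmark runs."""
    readings = []
    previous_io = psutil.disk_io_counters()
    for _ in range(samples):
        # cpu_percent blocks for interval_s and returns utilization over that window.
        cpu_pct = psutil.cpu_percent(interval=interval_s)
        current_io = psutil.disk_io_counters()
        readings.append({
            "cpu_percent": cpu_pct,
            "read_mb_s": (current_io.read_bytes - previous_io.read_bytes) / interval_s / 1e6,
            "write_mb_s": (current_io.write_bytes - previous_io.write_bytes) / interval_s / 1e6,
        })
        previous_io = current_io
    return readings
```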
