Proxy Report v0.2

Unbiased Proxy Benchmark 24/7


Our Testing Methodology

At Proxy Report, we are committed to providing transparent, data-driven comparisons of proxy providers. Our benchmarking system is fully automated and measures real-world performance across multiple dimensions—free from manual bias or interference.

Testing Framework

All tests are conducted automatically from three geographic regions: North America, Europe, and Asia. Each provider is measured through three independent systems:

  • IP Checker: Monitors request success rate and IP rotation quality.
  • Latency Monitor: Measures proxy responsiveness using our custom-built PingWithProxy tool.
  • Speed Tests: Tracks real download throughput under consistent workloads.

Latency Measurement

For high-precision latency, we use PingWithProxy — a tool created by our team for accurate HTTP(S) proxy measurement.

Key features:

  • Nanosecond-precision TTFB measurements using process.hrtime.bigint()
  • TCP_NODELAY (Nagle's algorithm disabled) for reduced delay
  • Keep-alive support for stable, steady-state latency
  • Targets IP addresses directly to avoid DNS delays
  • Concurrency and rounds fully configurable

Metrics We Track

  • Latency: Trimmed average of TTFB (5th–95th percentile)
  • Speed: Median download throughput in Mbps
  • Success Rate: Share of HTTP requests that complete successfully
  • Failure Rate: Share of requests that fail (timeouts, connection errors, etc.)
  • IP Diversity: Proxy IP rotation quality, scored on a logarithmic curve

Data Collection & Metrics (With Pseudocode)

All data is collected from our latency, speed, and IP checker databases. The goal is to ensure fairness by applying consistent filters and calculations across all providers.

1. Trimmed Mean Latency

We remove the slowest and fastest 5% of results to avoid skew from outliers:

-- SQL
WITH ranked AS (
  SELECT median_ms,
         PERCENT_RANK() OVER (ORDER BY median_ms) AS pr
  FROM latency_table
  WHERE status = 'success'
    AND date >= NOW() - INTERVAL ? DAY
)
SELECT AVG(median_ms) AS avg_latency
FROM ranked
WHERE pr BETWEEN 0.05 AND 0.95;

2. Median Speed

Rather than the mean, which a few extreme runs can skew, we report the median download throughput:

-- SQL
SELECT AVG(speed_mbps) AS median_speed  -- mean of the one or two middle rows
FROM (
  SELECT speed_mbps
  FROM (
    SELECT speed_mbps,
           ROW_NUMBER() OVER (ORDER BY speed_mbps) AS rn,
           COUNT(*) OVER () AS cnt
    FROM speed_table
    WHERE status = 'success'
      AND date >= NOW() - INTERVAL ? DAY
  ) AS ordered
  WHERE rn IN (FLOOR((cnt+1)/2), CEIL((cnt+1)/2))
) AS med;
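
The FLOOR/CEIL pair selects the single middle row for an odd-length sample and both middle rows for an even-length one. A JavaScript equivalent (ours, for illustration) makes the logic easier to verify:

```javascript
// Median as in the SQL above: the middle value for odd-length input,
// the mean of the two middle values for even-length input.
function median(values) {
  const sorted = [...values].sort((a, b) => a - b);
  const n = sorted.length;
  const lo = Math.floor((n + 1) / 2) - 1; // 0-based FLOOR((cnt+1)/2)
  const hi = Math.ceil((n + 1) / 2) - 1;  // 0-based CEIL((cnt+1)/2)
  return (sorted[lo] + sorted[hi]) / 2;
}
```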

3. Success Rate

-- SQL
SELECT
  COUNT(*) AS total_requests,
  SUM(CASE WHEN status = 'success' THEN 1 ELSE 0 END) AS successful_requests,
  100.0 * SUM(CASE WHEN status = 'success' THEN 1 ELSE 0 END)
        / COUNT(*) AS success_rate
FROM ipchecker_table
WHERE date >= NOW() - INTERVAL ? DAY;
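
The same calculation in JavaScript, as an illustration assuming a list of per-request status strings:

```javascript
// Success rate as a percentage of total requests; everything that is not
// 'success' (timeouts, connection errors, bad responses) counts as a failure.
function successRate(statuses) {
  const total = statuses.length;
  const ok = statuses.filter((s) => s === "success").length;
  return total === 0 ? 0 : (100 * ok) / total;
}
```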

4. Final Scoring Logic

We compute a weighted score and apply a threshold bonus:

// JavaScript pseudocode
score = (latencyScore * 0.25) +
        (successScore * 0.35) +
        (speedScore * 0.25) +
        (ipUniquenessScore * 0.15);

if (allThresholdsMet) {  // every metric clears its minimum quality bar
  score += 0.5;          // consistency bonus
}

score = Math.min(score, 10);  // cap at the top of the 0-10 scale

5. IP Diversity Scaling

// JavaScript pseudocode
if (uniqueIps > 1) {
  // Logarithmic curve: diminishing returns as the IP pool grows, capped at 99
  ipUniqueness = Math.min(99, 50 * Math.log10(1 + uniqueIps / 500));
} else {
  ipUniqueness = 0;  // a single repeated IP earns no diversity credit
}

Scoring System

Providers are rated on a 0–10 scale using a weighted formula:

Metric          Weight
-------------   ------
Success Rate    35%
Latency         25%
Speed           25%
IP Uniqueness   15%

Providers that exceed minimum quality thresholds earn a small bonus of up to +0.5 for consistent reliability.
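
Put concretely, with all sub-scores assumed to be already normalized to a 0–10 scale (an illustration of the weighting, not the production code):

```javascript
// Weighted overall score on a 0-10 scale, using the weights in the table.
function overallScore({ success, latency, speed, ipUniqueness }, allThresholdsMet) {
  let score =
    success * 0.35 + latency * 0.25 + speed * 0.25 + ipUniqueness * 0.15;
  if (allThresholdsMet) score += 0.5; // reliability bonus
  return Math.min(score, 10);         // never exceed the top of the scale
}
```

For example, sub-scores of 9 (success), 8 (latency), 8 (speed), and 7 (IP uniqueness) with the bonus yield an overall score of 8.7.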

Grading System

Our letter grading system (A+ through F) is derived from the overall score and corresponds to the following ranges:

Grade   Score Range   Rating
-----   -----------   -------------
A+      9.5 - 10.0    Exceptional
A       9.0 - 9.4     Excellent
A−      8.5 - 8.9     Very Good
B+      8.0 - 8.4     Good
B       7.5 - 7.9     Above Average
B−      7.0 - 7.4     Average
C+      6.5 - 6.9     Fair
C       6.0 - 6.4     Below Average
D       5.0 - 5.9     Poor
F       0.0 - 4.9     Unacceptable
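
A sketch of the score-to-grade mapping (our illustration; band boundaries follow the table, with scores assumed rounded to one decimal, and ASCII hyphens standing in for the minus grades):

```javascript
// Map a 0-10 score to a letter grade using the ranges in the table above.
function gradeFor(score) {
  const bands = [
    [9.5, "A+"], [9.0, "A"], [8.5, "A-"], [8.0, "B+"], [7.5, "B"],
    [7.0, "B-"], [6.5, "C+"], [6.0, "C"], [5.0, "D"],
  ];
  for (const [min, grade] of bands) {
    if (score >= min) return grade;
  }
  return "F"; // everything below 5.0
}
```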

Performance Timeframes

  • 24 Hours: Real-time performance snapshot
  • 7 Days: Weekly average for mid-term reliability
  • 30 Days: Long-term trend analysis

Independence & Transparency

We maintain full independence from all providers. While some links may be affiliate-based, our testing and ranking systems are entirely automated and free from manual adjustments or paid influence. What you see is what the data shows.