distributions: support a zero max value in Zipf

There is no documentation saying that zero isn't okay, and the closed
interval [0, k] described by the documentation is perfectly well-defined
even when k is zero. As far as I can tell, there is no reason *not* to
support zero: a random variable that always returns the same value is
still a random variable. absl::Uniform will happily generate on the
interval [0, 1) for the same reason.

PiperOrigin-RevId: 694649518
Change-Id: Ib940406f762a30e27c19c846c45bd908ae8411c3
diff --git a/absl/random/distributions_test.cc b/absl/random/distributions_test.cc
index 850796e..4340aeb 100644
--- a/absl/random/distributions_test.cc
+++ b/absl/random/distributions_test.cc
@@ -470,6 +470,13 @@
   EXPECT_NEAR(6.5944, moments.mean, 2000) << moments;
 }
 
+TEST_F(RandomDistributionsTest, ZipfWithZeroMax) {
+  absl::InsecureBitGen gen;
+  for (int i = 0; i < 100; ++i) {
+    EXPECT_EQ(0, absl::Zipf(gen, 0));
+  }
+}
+
 TEST_F(RandomDistributionsTest, Gaussian) {
   std::vector<double> values(kSize);
diff --git a/absl/random/zipf_distribution.h b/absl/random/zipf_distribution.h
index 0600cfc..15f03ee 100644
--- a/absl/random/zipf_distribution.h
+++ b/absl/random/zipf_distribution.h
@@ -57,7 +57,7 @@
  public:
   using distribution_type = zipf_distribution;
 
-  // Preconditions: k > 0, v > 0, q > 1
+  // Preconditions: k >= 0, v > 0, q > 1
   // The precondidtions are validated when NDEBUG is not defined via
   // a pair of assert() directives.
   // If NDEBUG is defined and either or both of these parameters take invalid
@@ -152,7 +152,7 @@
     : k_(k), q_(q), v_(v), one_minus_q_(1 - q) {
   assert(q > 1);
   assert(v > 0);
-  assert(k > 0);
+  assert(k >= 0);
   one_minus_q_inv_ = 1 / one_minus_q_;
 
   // Setup for the ZRI algorithm (pg 17 of the paper).