
Commit 98ff11d

Fix flaky test test_horrible_queries (#3857)
`test_horrible_queries` can be flaky, but the flakiness is very hard to detect. This commit fixes it.

To understand the flakiness, here is the original test code:

```ruby
# then that large mangled field values are caught
10.times do |c|
  get = "GET /#{rand_data(10,120)} HTTP/1.1\r\nX-Test: #{rand_data(1024, 1024+(c*1024), false)}\r\n\r\n"
  assert_raises Puma::HttpParserError do
    parser.execute({}, get, 0)
    parser.reset
  end
end
```

The way I understand `rand_data(xxx, false)` is that the function should _always_ return invalid bytes (IOW, cause the parser to raise an exception). However, it is possible for that function to _sometimes_ return valid bytes.

The above test asserts that an error is raised when `parser.execute` is called, but if an exception gets raised, then `parser.reset` isn't called. If `parser.reset` isn't called, then subsequent calls to `parser.execute` will raise an exception _regardless_ of whether the input is parse-able or not. In other words, if the first iteration caused an exception, subsequent iterations are guaranteed to raise an exception.

This commit does two things:

1. Pulls `parser.reset` outside of the `assert_raises` block so that the flake is more obvious / reproducible.
2. Changes `rand_data` to always add an invalid byte, so that if the random data happens to contain parseable data, we'll at least have one byte guaranteed to fail.
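The failure mode described above can be reproduced in isolation. The following is a minimal sketch, not Puma's actual parser: `StickyParser` is a hypothetical stand-in that mimics the "sticky error" behavior, where once `execute` raises, every later call raises too until `reset` is invoked.

```ruby
# Hypothetical stand-in for a parser with sticky error state.
class StickyParser
  def initialize
    @errored = false
  end

  # Raises on invalid input; once errored, raises on *every*
  # subsequent call until #reset clears the state.
  def execute(data)
    raise "HttpParserError" if @errored
    if data.include?("\x01") # pretend 0x1 is an invalid byte
      @errored = true
      raise "HttpParserError"
    end
    :ok
  end

  def reset
    @errored = false
  end
end

parser = StickyParser.new

# The first call raises; note that nothing after the raising call
# inside an assert_raises block would have run, so no reset happens.
begin
  parser.execute("bad\x01input")
rescue RuntimeError
end

# Without a reset, even well-formed input now raises:
begin
  parser.execute("GET / HTTP/1.1")
rescue RuntimeError
  puts "valid input raised too"
end

parser.reset
parser.execute("GET / HTTP/1.1") # succeeds after the reset
```

This is exactly why moving `parser.reset` after the `assert_raises` block matters: the reset always runs, so one genuinely-invalid iteration can no longer poison the assertions of every iteration after it.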
1 parent da162d7 commit 98ff11d

1 file changed: test/test_http11.rb (9 additions, 6 deletions)
```diff
@@ -136,7 +136,10 @@ def rand_data(min, max, readable=true)
     if readable
       res << Digest(:SHA1).hexdigest(rand(count * 100).to_s) * (count / 40)
     else
-      res << Digest(:SHA1).digest(rand(count * 100).to_s) * (count / 20)
+      data = Digest(:SHA1).digest(rand(count * 100).to_s) * (count / 20)
+      # Guarantee there is an invalid byte at the end of the string
+      data.setbyte(data.bytesize - 1, 0x1)
+      res << data
     end

     res
@@ -208,8 +211,8 @@ def test_max_uri_path_length
     http = "GET #{path} HTTP/1.1\r\n\r\n"
     assert_raises Puma::HttpParserError do
       parser.execute(req, http, 0)
-      parser.reset
     end
+    parser.reset
   end

   def test_horrible_queries
@@ -220,34 +223,34 @@ def test_horrible_queries
       get = "GET /#{rand_data(10,120)} HTTP/1.1\r\nX-#{rand_data(1024, 1024+(c*1024))}: Test\r\n\r\n"
       assert_raises Puma::HttpParserError do
         parser.execute({}, get, 0)
-        parser.reset
       end
+      parser.reset
     end

     # then that large mangled field values are caught
     10.times do |c|
       get = "GET /#{rand_data(10,120)} HTTP/1.1\r\nX-Test: #{rand_data(1024, 1024+(c*1024), false)}\r\n\r\n"
       assert_raises Puma::HttpParserError do
         parser.execute({}, get, 0)
-        parser.reset
       end
+      parser.reset
     end

     # then large headers are rejected too
     get = "GET /#{rand_data(10,120)} HTTP/1.1\r\n"
     get += "X-Test: test\r\n" * (80 * 1024)
     assert_raises Puma::HttpParserError do
       parser.execute({}, get, 0)
-      parser.reset
     end
+    parser.reset

     # finally just that random garbage gets blocked all the time
     10.times do |c|
       get = "GET #{rand_data(1024, 1024+(c*1024), false)} #{rand_data(1024, 1024+(c*1024), false)}\r\n\r\n"
       assert_raises Puma::HttpParserError do
         parser.execute({}, get, 0)
-        parser.reset
       end
+      parser.reset
     end
   end
```