Bug 1399325 - Do not allow parsed URLs to exceed max length r=mayhemer

When normalizing the spec, some characters get percent-encoded, so even if the original input was shorter than the max length, the final result could end up exceeding it.
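As a rough illustration of why checking the input length alone is not enough (the names and the limit value below are placeholders, not the Necko code; the real limit comes from net_GetURLMaxLength()): percent-encoding turns one reserved byte into three, so a spec that fits under the limit before normalization can overshoot it afterwards.

```cpp
#include <cstdio>
#include <string>

// Placeholder limit; the real value comes from net_GetURLMaxLength().
static const size_t kMaxURLLength = 1048576;

// Minimal encoder: escape spaces only, each growing from 1 byte to 3 ("%20").
static std::string PercentEncodeSpaces(const std::string& in) {
    std::string out;
    out.reserve(in.size());
    for (char c : in) {
        if (c == ' ') {
            out += "%20";
        } else {
            out += c;
        }
    }
    return out;
}

int main() {
    // Input just under the limit, padded with characters that need escaping.
    std::string spec = "http://example.com/?q=";
    spec.append(kMaxURLLength - spec.size() - 1, ' ');

    std::string normalized = PercentEncodeSpaces(spec);
    std::printf("input %zu bytes, normalized %zu bytes, limit %zu\n",
                spec.size(), normalized.size(), kMaxURLLength);
    // The input passes a naive pre-normalization length check, yet the
    // normalized spec ends up roughly three times the limit.
    return 0;
}
```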

MozReview-Commit-ID: 78IDM7Hoa55

--HG--
extra : rebase_source : b57caca6e5c55bf290b15e2f084e72e09f051c8f
Valentin Gosu 2018-03-23 08:49:41 +01:00
parent 290fc5ad94
commit 8376d2c957


@@ -781,6 +781,12 @@ nsStandardURL::BuildNormalizedSpec(const char *spec,
     URLSegment query(mQuery);
     URLSegment ref(mRef);
 
+    // The encoded string could be longer than the original input, so we need
+    // to check the final URI isn't longer than the max length.
+    if (approxLen + 1 > (uint32_t) net_GetURLMaxLength()) {
+        return NS_ERROR_MALFORMED_URI;
+    }
+
     //
     // generate the normalized URL string
     //
@@ -931,6 +937,9 @@ nsStandardURL::BuildNormalizedSpec(const char *spec,
     }
     mSpec.SetLength(strlen(buf));
     NS_ASSERTION(mSpec.Length() <= approxLen, "We've overflowed the mSpec buffer!");
+    MOZ_ASSERT(mSpec.Length() <= (uint32_t) net_GetURLMaxLength(),
+               "The spec should never be this long, we missed a check.");
+
     return NS_OK;
 }
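For context, a minimal sketch of the control flow the two hunks add, using placeholder names (kMaxLen, Normalize, ApproxLen, BuildNormalizedSpecSketch are stand-ins for illustration, not the actual nsStandardURL code): reject early when the worst-case encoded length would exceed the limit, then assert afterwards that the real length stayed inside it.

```cpp
#include <cassert>
#include <cstdio>
#include <string>

// Placeholder limit standing in for net_GetURLMaxLength().
static const size_t kMaxLen = 1048576;

// Toy normalization pass: percent-encode spaces, so one byte can become three.
static std::string Normalize(const std::string& spec) {
    std::string out;
    for (char c : spec) {
        if (c == ' ') {
            out += "%20";
        } else {
            out += c;
        }
    }
    return out;
}

// Worst-case length of the normalized spec, computed before building it.
static size_t ApproxLen(const std::string& spec) {
    return spec.size() * 3;
}

static bool BuildNormalizedSpecSketch(const std::string& spec, std::string& out) {
    size_t approxLen = ApproxLen(spec);

    // First hunk: the encoded string could be longer than the input, so the
    // worst case (+1, presumably headroom for the terminating NUL) is checked
    // against the limit before anything is built.
    if (approxLen + 1 > kMaxLen) {
        return false;  // the patch returns NS_ERROR_MALFORMED_URI here
    }

    out = Normalize(spec);

    // Second hunk: a debug-only backstop; no path through normalization
    // should be able to produce a spec longer than the limit anymore.
    assert(out.size() <= kMaxLen && "The spec should never be this long");
    return true;
}

int main() {
    std::string out;
    std::string huge(kMaxLen, ' ');  // worst case triples, so reject up front
    std::printf("short spec accepted: %d\n",
                BuildNormalizedSpecSketch("http://example.com/a b", out));
    std::printf("huge spec accepted: %d\n",
                BuildNormalizedSpecSketch(huge, out));
    return 0;
}
```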