While it was common to think of calibrations in terms of "maximum ages to go with the minimum ages" in the 1990s and early 2000s, the field has moved on, and this is not a common practice anymore. With some notable exceptions (e.g., treePL), modern software packages (e.g., MCMCTree, MrBayes, BEAST 1, BEAST 2, RevBayes) generally give the user the option to specify a parametric calibration density with support on [0, infinity) or [offset, infinity), where the offset represents the minimum age. Exponential, lognormal, and truncated Cauchy densities are often used, and there is now a moderately large body of literature on how these can be made less arbitrary by fitting them to the distribution of fossil occurrences through time.

The user still can (and I've done this before) choose to treat, say, the 95th percentile of an exponential as a "soft" maximum and scale the rate parameter so as to place 95% of the total probability mass between the minimum and maximum ages, but that's just one possible way to think about it. Parametric densities also give you the freedom to specify your prior beliefs in different ways, some of which may make more sense depending on the situation: e.g., using a mean and a standard deviation, the distance between the minimum and the mode, the 95% prior credibility interval, etc.

Specifying a soft maximum and parameterizing your density accordingly can help avoid implausibly old age estimates when your fossil record sucks, but when it doesn't, the framework allows you to do much better than that.
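To make the "soft maximum" idea concrete, here's a minimal sketch (my own illustration, not taken from any particular package) of how you'd choose the rate of an offset-exponential calibration density so that 95% of the prior mass sits between a hard minimum age and a soft maximum age. The function name and the example ages are hypothetical:

```python
import math

def exp_rate_for_soft_max(min_age, soft_max, mass=0.95):
    """Rate lambda such that P(min_age < X < soft_max) = mass
    for X ~ min_age + Exponential(lambda).

    Solves 1 - exp(-lambda * (soft_max - min_age)) = mass,
    i.e. the exponential CDF evaluated at the min-to-max distance.
    """
    return -math.log(1.0 - mass) / (soft_max - min_age)

# Hypothetical calibration: minimum 33.9 Ma, soft maximum 56.0 Ma
rate = exp_rate_for_soft_max(33.9, 56.0)

# Sanity check: the soft maximum is indeed the 95th percentile
p_below = 1.0 - math.exp(-rate * (56.0 - 33.9))
print(round(p_below, 4))  # → 0.95
```

The remaining 5% of the mass extends beyond the soft maximum, which is what makes it "soft": ages older than the maximum are discouraged but not forbidden.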