Multimeter wrong voltage reading

If you have some way of checking the voltage of the battery inside the meter, do that first; a weak battery could be the issue.

Most digital multimeters use a fixed internal reference of about 6V derived from the battery. If the battery voltage falls below six volts, that reference can no longer be maintained and the meter starts reading high. Since the meter is new, that would mean a bad battery.

If this is the case, the inaccuracy should get progressively worse as the battery continues to run down.

For what it’s worth, at a previous workplace we kept a mix of high-quality (Fluke) and low-quality (Centech / Harbor Freight) multimeters. They wanted us to use the cheap meters for non-critical work to save money. All of them got calibrated and certified, and they all consistently measured within the accuracy specified on the box, which, to be sure, was tighter on the Flukes.

Looking up the current cheap Centech on Harbor Freight’s site, they specify +/- 1% and +/- 2 digits (they don’t actually clarify whether that is % of reading or % of range; if unstated, it should be the more restrictive % of reading). So if the battery is really at exactly 4.2V and I was using the 250V range setting (which only displays 1 decimal place, if I remember right), the meter would be within spec reading anything up to 4.4V (4.2V + 1% still rounds to 4.2 on a one-decimal display, plus 2 digits of 0.1V).

If I turn the dial down to the 20V range and get 3 decimal places, it should not read higher than 4.244 V (4.2V + 1% + 2 digits of 0.001V).
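
If it helps to see that arithmetic laid out, here’s a quick back-of-the-envelope sketch in Python. The 1% / 2-digit figures are the Centech spec quoted above; the per-range resolutions (0.1V on the 250V range, 0.001V on the 20V range) are just my assumptions about the display, not anything from a datasheet.

```python
def max_in_spec_reading(true_volts, pct_of_reading, digits, resolution):
    """Highest displayed value still within a +/-(% of reading) +/-(digits) spec.

    resolution is the value of one display count on the chosen range
    (e.g. 0.1 V on a 250 V range that shows one decimal place).
    """
    upper = true_volts * (1 + pct_of_reading) + digits * resolution
    counts = round(upper / resolution)  # the display can only show whole counts
    return counts * resolution

battery = 4.2  # assume the charger hit its target exactly

print(f"250 V range: up to {max_in_spec_reading(battery, 0.01, 2, 0.1):.1f} V")    # ~4.4 V
print(f"20 V range:  up to {max_in_spec_reading(battery, 0.01, 2, 0.001):.3f} V")  # ~4.244 V
```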

Note that the battery voltage itself can vary, because chargers are allowed tolerances, too. When a manufacturer says to charge a battery to 4.2V, they know the charger will not have perfect accuracy, so they usually mean something like +/- 0.05V. Such a battery can acceptably sit as high as 4.25V (though I think in reality most charger manufacturers bias their chargers a little low to stay on the safe side).
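
And just to stack the two tolerances, here’s a rough sketch of the worst case using the same numbers (the +/- 0.05V charger tolerance is my assumption from above, not a datasheet figure):

```python
charge_target = 4.2
charger_tolerance = 0.05                                    # assumed charger accuracy
highest_real_voltage = charge_target + charger_tolerance    # 4.25 V

# Meter spec on the 20 V range: +/-1% of reading, +/-2 digits of 0.001 V
highest_in_spec_reading = highest_real_voltage * 1.01 + 2 * 0.001

print(f"battery could really be {highest_real_voltage:.2f} V")
print(f"meter could read up to  {highest_in_spec_reading:.3f} V and still be in spec")
```

In other words, a reading in the high 4.2s on the 20V range can be explained by tolerances stacking up before anything is actually broken.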