A finger held in the air exhibits microvibrations, which are reduced when it touches a static object. When a finger moves along a surface, the friction between them produces vibrations, which do not occur when the finger moves freely in the air. With an inertial measurement unit (IMU) capturing such motion characteristics, we demonstrate the feasibility of detecting contact between the finger and static objects. We call our technique ActualTouch. Studies show that a single nail-mounted IMU on the index finger provides sufficient data to train a binary touch-status classifier (i.e., touch vs. no-touch) with an accuracy above 95%, generalised across users. This model, trained on a rigid tabletop surface, retained an average accuracy of 96% on seven other types of everyday surfaces with varying rigidity, and in walking and sitting scenarios where no touch occurred. ActualTouch can be combined with other interaction techniques, such as a uni-stroke gesture recogniser on arbitrary surfaces, where the touch status from ActualTouch delimits the motion gesture data fed into the recogniser. We demonstrate the potential of ActualTouch in a range of scenarios, such as interaction for augmented reality applications, and leveraging everyday surfaces and objects.
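To illustrate the idea (not the authors' actual pipeline), the following is a minimal sketch of a binary touch-status classifier trained on vibration features from IMU accelerometer windows. All names, feature choices, the 400 Hz sampling rate, and the synthetic signal generator are assumptions for illustration: touch windows carry an extra high-frequency friction component, while no-touch windows carry only low-frequency microvibration and sensor noise.

```python
# Hypothetical sketch of a touch/no-touch classifier from IMU windows.
# Feature choices, frequencies, and signal model are illustrative assumptions,
# not the feature set or data from the ActualTouch paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

FS = 400          # assumed IMU sampling rate (Hz)
WIN = 256         # samples per classification window

def vibration_features(window, fs=FS):
    """Simple spectral features: overall variability and high-frequency energy."""
    spec = np.abs(np.fft.rfft(window - window.mean()))
    freqs = np.fft.rfftfreq(len(window), 1.0 / fs)
    hf_energy = spec[freqs > 50].sum()   # friction vibrations live above ~50 Hz here
    return np.array([window.std(), hf_energy])

rng = np.random.default_rng(0)

def synth_windows(touch, n=200):
    """Generate synthetic accelerometer windows for one class (illustrative only)."""
    t = np.arange(WIN) / FS
    X = []
    for _ in range(n):
        sig = 0.02 * rng.standard_normal(WIN)            # sensor noise
        if touch:
            # sliding on a surface: add a high-frequency friction component
            sig += 0.2 * rng.uniform(0.5, 1.5) * np.sin(2 * np.pi * 120 * t)
        else:
            # free finger in the air: slow microvibration only
            sig += 0.1 * np.sin(2 * np.pi * 8 * t)
        X.append(vibration_features(sig))
    return np.array(X)

X = np.vstack([synth_windows(True), synth_windows(False)])
y = np.array([1] * 200 + [0] * 200)   # 1 = touch, 0 = no-touch
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
```

In this toy setup the two classes are trivially separable by the high-frequency energy feature; the real system must instead cope with varying surface rigidity, user differences, and body motion such as walking, which is why a learned classifier over richer features is used rather than a fixed threshold.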