Various wearable sensors capturing body vibration, jaw movement, hand gestures, etc., have shown promise in detecting when a person is currently eating. However, based on the existing literature and user surveys conducted in this study, we argue that a Just-in-Time eating intervention, triggered upon detecting a current eating event, is sub-optimal. An eating intervention triggered at "About-to-Eat" moments could instead give users the opportunity to adopt better and healthier eating behavior. In this work, we present a wearable sensing framework that predicts "About-to-Eat" moments and the "Time until the Next Eating Event". The framework consists of an array of sensors that capture physical activity, location, heart rate, electrodermal activity, skin temperature, and caloric expenditure. Applying signal processing and machine learning to this raw multimodal sensor stream, we train an "About-to-Eat" moment classifier that reaches an average recall of 77%. The "Time until the Next Eating Event" regression model attains a correlation coefficient of 0.49. Personalization further increases the performance of both models, to an average recall of 85% and a correlation coefficient of 0.65. The contributions of this paper include user surveys related to this problem, the design of a system that predicts "About-to-Eat" moments, and a regression model trained on multimodal sensor data in real time to enable potential eating interventions for the user.
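The abstract reports two evaluation metrics: average recall for the "About-to-Eat" classifier and a correlation coefficient for the regression model. The sketch below illustrates how such a two-headed evaluation could be set up on windowed sensor features. The random-forest models, the synthetic features, and the label definitions are all illustrative assumptions; the paper's actual features, models, and data are not reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor
from sklearn.metrics import recall_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for windowed multimodal features (activity, HR, EDA, ...).
X = rng.normal(size=(500, 6))
# Hypothetical labels: binary "about-to-eat" flag and minutes until next meal.
y_cls = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
y_reg = np.abs(X[:, 0]) * 60.0

X_tr, X_te, yc_tr, yc_te, yr_tr, yr_te = train_test_split(
    X, y_cls, y_reg, test_size=0.3, random_state=0)

# Classification head: evaluated by recall, as in the abstract.
clf = RandomForestClassifier(random_state=0).fit(X_tr, yc_tr)
recall = recall_score(yc_te, clf.predict(X_te))

# Regression head: evaluated by Pearson correlation with the true times.
reg = RandomForestRegressor(random_state=0).fit(X_tr, yr_tr)
corr = np.corrcoef(yr_te, reg.predict(X_te))[0, 1]
```

Recall measures the fraction of true "About-to-Eat" windows the classifier catches, while the correlation coefficient measures how well the predicted time-to-next-meal tracks the actual one.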