By simulating a prediction-error filter from the dips picked on a dip spectrum, I devise an interpolation scheme. The procedure has three stages: (1) the dips of the linear events in a given data set are estimated from the dip spectrum obtained by slant stacking; (2) the prediction-error filters in the $(f,x)$ domain are simulated by placing zeros along the picked dips in the $(f,k)$ spectrum; (3) the missing traces are found by minimizing the filtered output in the least-squares sense. The advantage of this approach lies in its ability to handle multiple dips in both regularly and irregularly sampled traces.
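The three stages above can be sketched numerically. The following is a minimal illustration, not the paper's implementation: the grid, wavelet, dip values, and frequency band are my own illustrative assumptions, and the example is restricted to regularly sampled traces with two linear events. Stage 1 picks dips from a slant-stack energy spectrum; stage 2 builds, at each temporal frequency, an annihilating filter whose zeros in $(f,k)$ lie along the picked dips; stage 3 fills the missing traces by least squares.

```python
import numpy as np

# Illustrative parameters (assumptions, not the paper's exact setup).
nt, nx, dt, dx = 128, 32, 0.004, 10.0        # time/space sampling
t = np.arange(nt) * dt
x = np.arange(nx) * dx
true_dips = (0.0005, -0.0003)                # two linear events, in s/m

def wavelet(tau, f0=25.0):
    """Ricker wavelet, used to paint each linear event."""
    a = (np.pi * f0 * tau) ** 2
    return (1 - 2 * a) * np.exp(-a)

# Synthetic data: two dipping linear events.
d = np.zeros((nt, nx))
for p in true_dips:
    for ix in range(nx):
        d[:, ix] += wavelet(t - 0.25 - p * x[ix])

# Decimate: these traces are treated as missing (zeroed).
missing = [5, 11, 12, 20, 27]
known = [i for i in range(nx) if i not in missing]
dobs = d.copy()
dobs[:, missing] = 0.0

D = np.fft.rfft(dobs, axis=0)                # move to the (f, x) domain
freqs = np.fft.rfftfreq(nt, dt)

# Stage 1: dip spectrum by slant stacking the surviving traces;
# the dips are picked as the two strongest peaks of the energy E(p).
p_scan = np.linspace(-0.001, 0.001, 201)
E = np.zeros_like(p_scan)
Dk = D[:, known]
fb = freqs < 60.0                            # restrict to the signal band
for j, p in enumerate(p_scan):
    shift = np.exp(2j * np.pi * freqs[fb, None] * p * x[known][None, :])
    E[j] = np.sum(np.abs((Dk[fb] * shift).sum(axis=1)) ** 2)
peaks = [j for j in range(1, len(p_scan) - 1)
         if E[j] > E[j - 1] and E[j] > E[j + 1]]
peaks = sorted(peaks, key=lambda j: -E[j])[:len(true_dips)]
picked = sorted(p_scan[j] for j in peaks)

# Stages 2 and 3, frequency by frequency.
Dfull = D.copy()
for k, f in enumerate(freqs):
    # Stage 2: a spatial filter whose zeros sit on the picked dips in
    # (f, k); a linear event D(f, x_n) ~ z**n with z = exp(-2j*pi*f*p*dx)
    # is annihilated because z is a root of the filter polynomial.
    roots = [np.exp(-2j * np.pi * f * p * dx) for p in picked]
    c = np.poly(roots)                       # filter coefficients
    L = len(c) - 1
    C = np.zeros((nx - L, nx), dtype=complex)
    for r in range(nx - L):                  # convolution (annihilation) matrix
        C[r, r:r + L + 1] = c[::-1]
    # Stage 3: missing-trace values minimizing ||C d||^2 at this frequency.
    rhs = -C[:, known] @ D[k, known]
    sol, *_ = np.linalg.lstsq(C[:, missing], rhs, rcond=None)
    Dfull[k, missing] = sol

drec = np.fft.irfft(Dfull, n=nt, axis=0)
err = (np.linalg.norm(drec[:, missing] - d[:, missing])
       / np.linalg.norm(d[:, missing]))
```

On this noiseless synthetic, the annihilating filter is exact for the true dips, so the least-squares fill reproduces the removed traces closely (small relative error `err`). Handling irregular trace positions would replace the uniform shift `z**n` by exponentials evaluated at the actual coordinates, which this sketch does not show.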