# …in an orthogonal position

Suppose we have a clock with infinitely thin hour and minute hands that move continuously (not in discrete jumps, as real hands usually do). The clock's mechanics are ideal, so the angular speed of the minute hand is exactly 12 times that of the hour hand. For convenience, we will measure angles in sexagesimal degrees and time in minutes.

If we denote by $${\omega_m}$$ the angular speed of the minute hand and by $${\omega_h}$$ that of the hour hand, the angular motion laws of the two hands are:

$${\vartheta _m} = {\vartheta _{m,0}} + {\omega _m}t$$

$${\vartheta _h} = {\vartheta _{h,0}} + {\omega _h}t$$

In these relations, $${\vartheta_m}$$ and $${\vartheta _h}$$ are the angles that the two hands form with a fixed reference direction, while $${\vartheta_{m,0}}$$ and $${\vartheta _{h,0}}$$ are the values of those angles at the initial instant of observation (when $$t=0$$).

For simplicity, and without loss of generality, we can assume that the initial instant corresponds exactly to noon (or midnight), when the two hands are perfectly superimposed, and agree to measure the angles from the twelve o'clock direction; in this way the two equations become:

$${\vartheta _m} = {\omega _m}t$$

$${\vartheta _h} = \frac{{{\omega _m}}}{{12}}t$$

In particular, if the clock is accurate, the two angular speeds are $${\omega_m} = 6^\circ/\text{min}$$ (for the minute hand) and $${\omega _h} = 0.5^\circ/\text{min}$$ (for the hour hand).
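These speeds follow directly from the hands' periods: the minute hand sweeps a full turn in 60 minutes, the hour hand in 12 hours. A quick numerical check (illustrative sketch, not part of the original derivation):

```python
# Angular speeds from the hands' periods:
# the minute hand covers 360 degrees in 60 min,
# the hour hand covers 360 degrees in 12 h = 720 min.
omega_m = 360 / 60    # degrees per minute
omega_h = 360 / 720   # degrees per minute

print(omega_m)            # → 6.0
print(omega_h)            # → 0.5
print(omega_m / omega_h)  # → 12.0  (the 12:1 speed ratio)
```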

Therefore:

$${\vartheta _m} = 6t$$

$${\vartheta _h} = 0.5t$$

The question: starting from the beginning of the observation, after how long are the two hands orthogonal for the first time? The minute hand must gain $$90^\circ$$ on the hour hand, so:

$$6t - 0.5t = 90^\circ$$

$$t = \frac{90}{5.5} = \frac{180}{11} \simeq 16\ \text{min}\ 22\ \text{sec}.$$
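The same computation, carried out numerically (a minimal sketch assuming the speeds $$6^\circ/\text{min}$$ and $$0.5^\circ/\text{min}$$ derived above):

```python
# The hands separate at the relative speed 6 - 0.5 = 5.5 degrees/min,
# so the first 90-degree separation occurs when 5.5 * t = 90.
t = 90 / 5.5                  # minutes, exactly 180/11

minutes = int(t)              # whole minutes
seconds = (t - minutes) * 60  # remaining seconds

print(round(t, 4))            # → 16.3636
print(minutes)                # → 16
print(round(seconds, 1))      # → 21.8
```

Note that the remainder is about 21.8 seconds, so rounding to the nearest second gives 16 min 22 s rather than 16 min 20 s.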