In this paper, we consider the following system of differential equations:
\[
\dot \theta = \omega + \Theta (\theta ,z), \quad \dot z = Az + f(\theta ,z),
\]
where $\theta \in {C^m}$, $\omega = ({\omega _1}, \ldots ,{\omega _m}) \in {R^m}$, $z \in {C^n}$, $A$ is a diagonalizable matrix, $f$ and $\Theta$ are analytic functions of both variables and $2\pi$-periodic in each component of the vector $\theta$, $\Theta = O(|z|)$, and $f = O(|z|^2)$ as $z \to 0$. We study the normal form of this system and prove that it can be transformed into the linear system
\[
\dot \theta = \omega , \quad \dot z = Az
\]
by an analytic transformation, provided that the eigenvalues of $A$ and the frequency $\omega$ satisfy certain small-divisor conditions.
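For orientation, a typical small-divisor condition of this kind (a standard Diophantine-type assumption in the normal-form literature; the precise condition used in the paper may differ) couples the eigenvalues $\lambda_1, \ldots, \lambda_n$ of $A$ to the frequency vector $\omega$:

```latex
% Illustrative Diophantine-type small-divisor condition (assumed form):
% for some constants \gamma > 0 and \tau > 0, require
\[
\bigl| i\langle k, \omega \rangle + \langle l, \lambda \rangle - \lambda_j \bigr|
\;\ge\; \frac{\gamma}{\bigl(|k| + |l|\bigr)^{\tau}}
\]
% for all j = 1, \ldots, n, all integer vectors k \in Z^m, and all
% multi-indices l \in Z_+^n with |k| + |l| \ge 2, where
% \langle k, \omega \rangle = k_1 \omega_1 + \cdots + k_m \omega_m and
% \langle l, \lambda \rangle = l_1 \lambda_1 + \cdots + l_n \lambda_n.
```

Bounds of this shape keep the divisors arising in the power-series construction of the normalizing transformation away from zero fast enough for the transformation to converge analytically.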