In mathematics, Siegel's theorem on integral points states that for a smooth algebraic curve C of genus g > 0 defined over a number field K, presented in affine space in a given coordinate system, there are only finitely many points on C whose coordinates lie in the ring of integers O of K.
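
A classical illustration (not part of Siegel's original formulation) is the Mordell equation

    y^{2} = x^{3} + k, \qquad k \in \mathbb{Z},\ k \neq 0,

which defines a smooth affine plane curve of genus 1; Siegel's theorem therefore implies that it has only finitely many solutions (x, y) with x, y \in \mathbb{Z}.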

The theorem was first proved in 1929 by Carl Ludwig Siegel and was the first major result on Diophantine equations that depended only on the genus and not on any special algebraic form of the equations. For g > 1 it was superseded by Faltings's theorem in 1983.

History

In 1929, Siegel proved the theorem by combining a version of the Thue–Siegel–Roth theorem, from Diophantine approximation, with the Mordell–Weil theorem from Diophantine geometry (required in Weil's version, in order to apply it to the Jacobian variety of C).

In 2002, Umberto Zannier and Pietro Corvaja gave a new proof using a method based on the subspace theorem.[1]

Effective versions

Siegel's result was ineffective (see effective results in number theory), since Thue's method in Diophantine approximation is itself ineffective: it shows that an algebraic number has only finitely many very good rational approximations, but provides no bound on their size, and hence yields no bound on the heights of the integral points. Effective results in some cases derive from Baker's method.
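
As an illustration of the shape of such effective bounds (the constants named here are schematic, not taken from a specific statement), for the Mordell equation y^{2} = x^{3} + k with k a nonzero integer, Baker's method yields a bound

    \max(|x|, |y|) \le \exp\!\left( C\, |k|^{\kappa} \right)

for all integer solutions (x, y), with effectively computable absolute constants C, \kappa > 0; sharper explicit bounds are known.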

References

  1. Corvaja, P. and Zannier, U., "A subspace theorem approach to integral points on curves", Comptes Rendus Acad. Sci., 334 (2002), pp. 267–271. doi:10.1016/S1631-073X(02)02240-9