PageRank computes an iterative approximation to the principal eigenvector of the transition matrix of a random walk on the graph, with two modifications: a node with no outbound edges is treated as if it linked to every page, and a damping ("boredom") factor mixes in a small chance of jumping to a uniformly random page. Together these make the chain ergodic (irreducible and aperiodic), which ensures the resulting Markov process is well defined, is not periodic, and has a unique stationary distribution.
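The iteration described above can be sketched in a few lines. This is a minimal power-iteration version, assuming a dense adjacency matrix and the conventional damping factor of 0.85; real implementations work on sparse link structures.

```python
import numpy as np

def pagerank(adj, d=0.85, tol=1e-10):
    """Power iteration for PageRank on a dense adjacency matrix.

    adj[i][j] = 1 if page i links to page j. Dangling pages (no
    outbound links) are treated as linking to every page, and the
    damping factor d models the surfer's "boredom" jump.
    """
    A = np.asarray(adj, dtype=float)
    n = A.shape[0]
    # Dangling nodes link uniformly to everything.
    A[A.sum(axis=1) == 0] = 1.0
    P = A / A.sum(axis=1, keepdims=True)  # row-stochastic transition matrix
    r = np.full(n, 1.0 / n)               # start from the uniform distribution
    while True:
        r_next = d * (r @ P) + (1 - d) / n  # one damped transition step
        if np.abs(r_next - r).sum() < tol:
            return r_next
        r = r_next
```

The damping makes each iteration a contraction, so the loop converges geometrically regardless of the starting vector.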
To prove it correct, you can go off and study how Markov chains work. There is a lot of work available in this space to build on.
To verify a particular PageRank-computed eigenvector more loosely: show that if you start on a random page with probability proportional to its PageRank and transition to one of the pages it links to (restarting from a uniformly random page if it has no outbound links), then the probability you land on any given page _also_ matches its PageRank, just like the selection of the previous page. In other words, the PageRank distribution is a fixed point of the transition.
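That fixed-point check can be done empirically by simulation. Below is a sketch of one random-surfer step under the assumptions above; the `links` and `rank` dictionaries are hypothetical inputs, and the damping jump is included so the step matches the full model. Sampling many such steps and comparing the landing histogram against `rank` is the loose verification.

```python
import random

def one_step(links, rank, d=0.85):
    """Take one random-surfer step starting from the PageRank distribution.

    `links` maps each page to a list of its outbound targets; `rank`
    maps each page to its PageRank score. Returns the page landed on.
    """
    pages = list(rank)
    # Start on a page with probability proportional to its PageRank.
    page = random.choices(pages, weights=[rank[p] for p in pages])[0]
    # With probability 1 - d the surfer gets bored and jumps anywhere;
    # a dangling page (no outbound links) behaves the same way.
    if random.random() > d or not links[page]:
        return random.choice(pages)
    return random.choice(links[page])
```

If `rank` really is the stationary distribution, the empirical landing frequencies should converge to `rank` as the number of sampled steps grows.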
A more rigorous proof that PageRank is correct proceeds by showing that the damped transition matrix is stochastic, irreducible, and aperiodic, so by the Perron-Frobenius theorem it has a unique stationary distribution: the vector satisfying the global balance condition π = πG, which is exactly the PageRank vector. (Note that it is global balance, not detailed balance, that holds here; the web graph is directed, so the chain is generally not reversible.)
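The stationarity property is easy to check numerically. This sketch uses a small made-up three-page graph: it builds the damped transition matrix, extracts the left eigenvector for eigenvalue 1, and confirms that one transition step leaves it unchanged.

```python
import numpy as np

# A small made-up graph: 0 -> 1, 1 -> 2, 2 -> 0 and 2 -> 1,
# written as a row-stochastic transition matrix.
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.5, 0.5, 0.0]])
d = 0.85
n = P.shape[0]
G = d * P + (1 - d) / n  # damped transition matrix ("Google matrix")

# PageRank is the stationary distribution of G: the left eigenvector
# of G with eigenvalue 1, normalized to sum to one.
vals, vecs = np.linalg.eig(G.T)
r = np.real(vecs[:, np.argmax(np.real(vals))])
r /= r.sum()

# Global balance: taking one step of the chain leaves r unchanged.
assert np.allclose(r @ G, r)
```

The same check works for any graph; only the construction of `P` changes.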
Proving that this eigenvector is useful for its stated purpose is another matter. In practice, Google applies many tweaks to the raw PageRank algorithm when deciding what to show users: fighting spam and link farms, compensating for known limitations of the algorithm such as its bias toward older material, and so on.
That said, the kind of Markov process it builds upon has been used for everything from rendering movies to determining the constants needed to develop the first atomic bomb, so the underlying principles are quite sound!