Although the Elman network is powerful enough to handle a variety of language-processing tasks, it has some shortcomings. For example, the original Elman network cannot always handle long-distance dependencies appropriately, such as number agreement between nouns and verbs in sentences containing many relative pronouns. This limitation likely arises from the structure of the context and hidden layers, which can preserve only the state of the network from one previous time step. Here, we propose an extension of the Elman network, a generalized simple recurrent neural network, that preserves the inner states of the previous n time steps. When the model processed a corpus containing many relative pronouns in multiple center-embedded structures, it handled the long-distance number agreement adequately. These models can be regarded as a natural extension of the simple recurrent neural network for dealing with complex linguistic structures.
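The idea of a context that preserves the previous n inner states can be sketched as follows. This is a minimal illustration, not the authors' implementation: all names, dimensions, and weight initializations are assumptions, and no training procedure is shown.

```python
import numpy as np

class GeneralizedSRN:
    """Sketch of a simple recurrent network whose context keeps the
    previous n hidden states instead of only the most recent one."""

    def __init__(self, input_size, hidden_size, n_context, seed=0):
        rng = np.random.default_rng(seed)
        self.n_context = n_context
        # Input weights plus one weight matrix per stored context state
        # (illustrative parameterization).
        self.W_in = rng.normal(0, 0.1, (hidden_size, input_size))
        self.W_ctx = rng.normal(0, 0.1, (n_context, hidden_size, hidden_size))
        self.b = np.zeros(hidden_size)
        # Context buffer: the n most recent hidden states (newest first).
        self.context = np.zeros((n_context, hidden_size))

    def step(self, x):
        # The hidden activation depends on the input and on all n stored
        # states; with n_context = 1 this reduces to the original Elman net.
        h = np.tanh(self.W_in @ x
                    + sum(self.W_ctx[k] @ self.context[k]
                          for k in range(self.n_context))
                    + self.b)
        # Shift the buffer: drop the oldest state, prepend the new one.
        self.context = np.vstack([h[None, :], self.context[:-1]])
        return h

net = GeneralizedSRN(input_size=4, hidden_size=8, n_context=3)
for t in range(5):
    h = net.step(np.eye(4)[t % 4])  # one-hot inputs as a stand-in for words
print(h.shape)  # (8,)
```

Because the hidden layer can consult states from n steps back rather than only the immediately preceding one, information about a head noun can, in principle, survive intervening relative clauses until the agreeing verb is reached.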