An efficient generalized hybrid constructive (GHC) learning algorithm for multioutput radial basis function (RBF) networks is proposed to obtain a compact network with good generalization capability. With this algorithm, one can train the adjustable parameters and determine the optimal network structure simultaneously. First, an initialization method based on the growing and pruning algorithm is used to select the important initial hidden neurons and the candidate ones. Then, by introducing a generalized hidden matrix, a structured parameter optimization algorithm is presented to train a multioutput RBF network of fixed size, combining the Levenberg-Marquardt (LM) algorithm with the least-squares method. Starting from an appropriate number of hidden neurons, new neurons chosen from the candidates are added one at a time whenever training becomes trapped in a local minimum. By incorporating an improved incremental constructive scheme, training after new neurons are added builds on the previous results, so the GHC learning algorithm avoids a trial-and-error procedure. Furthermore, an improved computation for LM training resolves the memory limitation problem. Computational complexity analysis and experimental results demonstrate that this algorithm efficiently achieves better performance.
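To make the hybrid idea concrete, the following is a minimal sketch (not the paper's GHC algorithm) of the general split it describes: for a fixed number of hidden neurons, the linear output weights of a multioutput Gaussian RBF network are solved by least squares, while the nonlinear parameters (centers and widths) are refined by Levenberg-Marquardt. The function names, the choice of Gaussian basis, and the initialization from random training samples are illustrative assumptions, using NumPy and SciPy rather than any implementation from the paper.

```python
import numpy as np
from scipy.optimize import least_squares

def hidden_matrix(X, centers, widths):
    """Gaussian RBF activations: phi_j(x) = exp(-||x - c_j||^2 / (2 * s_j^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * widths**2))

def fit_rbf(X, Y, n_hidden, seed=0):
    """Hybrid training sketch: LM on centers/widths, least squares on output weights.

    X: (N, d) inputs, Y: (N, q) multioutput targets. Illustrative only.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Initialize centers from random training samples; widths from the input spread.
    c0 = X[rng.choice(len(X), n_hidden, replace=False)]
    s0 = np.full(n_hidden, X.std())
    theta0 = np.concatenate([c0.ravel(), np.log(s0)])  # log keeps widths positive

    def residuals(theta):
        centers = theta[: n_hidden * d].reshape(n_hidden, d)
        widths = np.exp(theta[n_hidden * d:])
        Phi = hidden_matrix(X, centers, widths)
        # Output weights for all outputs at once via least squares (linear step).
        W, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
        return (Phi @ W - Y).ravel()

    # Levenberg-Marquardt refinement of the nonlinear parameters only.
    sol = least_squares(residuals, theta0, method="lm")
    centers = sol.x[: n_hidden * d].reshape(n_hidden, d)
    widths = np.exp(sol.x[n_hidden * d:])
    Phi = hidden_matrix(X, centers, widths)
    W, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
    return centers, widths, W
```

Because the output weights are recomputed in closed form inside the residual, LM only searches over the nonlinear parameters, which is the reason such hybrid schemes can reach a compact, well-fitted network with far fewer optimization variables than joint training of all parameters.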