Automating pneumonia diagnosis from chest X-ray images could significantly improve patient outcomes. A major challenge is that disease-relevant information (features) must be extracted directly from cluttered image backgrounds. Motivated by recent advances in Convolutional Neural Networks (CNNs), we propose a hierarchical weighting deep learning model, ChestWNet, that combines DenseNet and transfer learning to detect and localize thoracic diseases in chest X-rays. The hierarchical weighting networks are designed to assign scores reflecting the importance of specific pixels and regions, learning weights at the pixel, region, and image levels; these weighting networks and the image classification network are trained jointly in an end-to-end manner. The chest X-ray datasets are rebalanced to address the label-imbalance problem they exhibit. Extensive experiments show that ChestWNet significantly outperforms established prediction methods, and the approach also transfers to similar scenarios with fixed regions of interest in images.
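The hierarchical weighting idea described above can be sketched in PyTorch. This is a minimal illustration under stated assumptions, not the paper's implementation: the pixel-, region-, and image-level weights are assumed to be sigmoid-gated attention maps combined multiplicatively, the small convolutional backbone is a stand-in for DenseNet, and the class count (14) and region grid size are illustrative placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class HierarchicalWeighting(nn.Module):
    """Sketch of pixel-, region-, and image-level weighting (assumed design)."""

    def __init__(self, channels, region_size=4):
        super().__init__()
        # Pixel level: one importance score per spatial location.
        self.pixel = nn.Sequential(nn.Conv2d(channels, 1, kernel_size=1), nn.Sigmoid())
        # Region level: scores on a coarse grid, upsampled back to feature size.
        self.region_pool = nn.AvgPool2d(region_size)
        self.region = nn.Sequential(nn.Conv2d(channels, 1, kernel_size=1), nn.Sigmoid())
        # Image level: a single scalar score per image.
        self.image = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Conv2d(channels, 1, kernel_size=1), nn.Sigmoid()
        )

    def forward(self, feats):
        h, w = feats.shape[-2:]
        w_pix = self.pixel(feats)                                # (B, 1, H, W)
        w_reg = self.region(self.region_pool(feats))             # (B, 1, H/r, W/r)
        w_reg = F.interpolate(w_reg, size=(h, w), mode="nearest")
        w_img = self.image(feats)                                # (B, 1, 1, 1)
        # Multiplicative combination of the three weight levels (assumption).
        return feats * w_pix * w_reg * w_img


class ChestWNetSketch(nn.Module):
    """Toy end-to-end model; the small conv backbone stands in for DenseNet."""

    def __init__(self, num_classes=14):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.weighting = HierarchicalWeighting(64)
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, num_classes)
        )

    def forward(self, x):
        # Weighting and classification are composed, so one backward pass
        # trains both jointly, matching the end-to-end setup described above.
        return self.head(self.weighting(self.backbone(x)))


model = ChestWNetSketch()
logits = model(torch.randn(2, 1, 64, 64))  # two grayscale 64x64 images
print(logits.shape)                         # torch.Size([2, 14])
```

Because every weighting branch is differentiable, gradients from the classification loss flow into the pixel-, region-, and image-level networks simultaneously, which is what makes the joint end-to-end training possible.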