Open Access Article

A Framework for Lung Cancer Survivability Prediction Using Optimized-Deep Neural Network Classification and Regression technique

Pradeep K.R, Naveen N.C

Section: Survey Paper, Product Type: Journal Paper
Volume-07, Issue-13, Page no. 57-66, May-2019

CrossRef-DOI:   https://doi.org/10.26438/ijcse/v7si13.5766

Online published on May 14, 2019

Copyright © Pradeep K.R, Naveen N.C. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


How to Cite this Paper


IEEE Style Citation: Pradeep K.R, Naveen N.C, “A Framework for Lung Cancer Survivability Prediction Using Optimized-Deep Neural Network Classification and Regression technique,” International Journal of Computer Sciences and Engineering, Vol.07, Issue.13, pp.57-66, 2019.

MLA Style Citation: Pradeep K.R, Naveen N.C "A Framework for Lung Cancer Survivability Prediction Using Optimized-Deep Neural Network Classification and Regression technique." International Journal of Computer Sciences and Engineering 07.13 (2019): 57-66.

APA Style Citation: Pradeep K.R, Naveen N.C, (2019). A Framework for Lung Cancer Survivability Prediction Using Optimized-Deep Neural Network Classification and Regression technique. International Journal of Computer Sciences and Engineering, 07(13), 57-66.

BibTex Style Citation:
@article{K.R_2019,
author = {Pradeep K.R and Naveen N.C},
title = {A Framework for Lung Cancer Survivability Prediction Using Optimized-Deep Neural Network Classification and Regression technique},
journal = {International Journal of Computer Sciences and Engineering},
issue_date = {5 2019},
volume = {07},
number = {13},
month = {5},
year = {2019},
issn = {2347-2693},
pages = {57-66},
url = {https://www.ijcseonline.org/full_spl_paper_view.php?paper_id=1077},
doi = {https://doi.org/10.26438/ijcse/v7i13.5766},
publisher = {IJCSE, Indore, INDIA},
}

RIS Style Citation:
TY  - JOUR
DO  - https://doi.org/10.26438/ijcse/v7i13.5766
UR  - https://www.ijcseonline.org/full_spl_paper_view.php?paper_id=1077
TI  - A Framework for Lung Cancer Survivability Prediction Using Optimized-Deep Neural Network Classification and Regression technique
T2  - International Journal of Computer Sciences and Engineering
AU  - Pradeep K.R
AU  - Naveen N.C
PY  - 2019
DA  - 2019/05/14
PB  - IJCSE, Indore, INDIA
SP  - 57
EP  - 66
IS  - 13
VL  - 07
SN  - 2347-2693
ER  -

           

Abstract

Lung cancer is among the most widely recognized deadly diseases in the world and a leading cause of loss of life. In this research, textual data from Electronic Health Records (EHRs) are investigated to predict survivability rates for patients affected by lung cancer. If a patient is predicted to survive for more than one year, chemotherapy treatment can be started for that patient. This paper examines an effective Batch Size-Optimizer based Deep Neural Network (Op-DNN) classifier framework, developed to predict patient survivability as a dead-or-alive status. For the patients classified as alive, an Op-DNN regression technique then predicts how many months they are expected to survive. The textual data set is classified and processed in batches at each iteration; the error generated by the classification of the first batch is fed back to the Op-DNN algorithm for subsequent iterations, reducing the loss while avoiding both underfitting and overfitting. The proposed method is compared against machine learning classifier algorithms on various parameters, demonstrating that the Op-DNN model achieves better accuracy.
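
To make the two-stage Op-DNN workflow concrete, the following is a minimal illustrative sketch in Python with Keras, assuming numeric features have already been extracted from the EHR text. The feature count, batch size, epoch count, and synthetic arrays are placeholder assumptions, not values from the paper, and the batch-size optimization step itself is only represented here by a fixed BATCH_SIZE constant.

# Minimal sketch of the classification-then-regression pipeline described above,
# using tensorflow.keras. All constants and data below are illustrative assumptions.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

N_FEATURES = 16   # assumed number of EHR-derived numeric features
BATCH_SIZE = 32   # batch size; the paper tunes this as part of Op-DNN
EPOCHS = 50

def build_classifier(n_features: int) -> keras.Model:
    """DNN classifier predicting patient status (alive vs. dead)."""
    model = keras.Sequential([
        layers.Input(shape=(n_features,)),
        layers.Dense(64, activation="relu"),
        layers.Dropout(0.3),               # dropout to limit overfitting
        layers.Dense(32, activation="relu"),
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-3),
                  loss="binary_crossentropy", metrics=["accuracy"])
    return model

def build_regressor(n_features: int) -> keras.Model:
    """DNN regressor predicting survival time in months for 'alive' patients."""
    model = keras.Sequential([
        layers.Input(shape=(n_features,)),
        layers.Dense(64, activation="relu"),
        layers.Dropout(0.3),
        layers.Dense(32, activation="relu"),
        layers.Dense(1),                   # linear output: months survived
    ])
    model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-3),
                  loss="mse", metrics=["mae"])
    return model

if __name__ == "__main__":
    # Synthetic stand-in data; real inputs would be features extracted from
    # EHR text (the paper's preprocessing is not reproduced here).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, N_FEATURES)).astype("float32")
    y_status = rng.integers(0, 2, size=500).astype("float32")   # 1 = alive
    y_months = rng.uniform(1, 60, size=500).astype("float32")   # survival months

    # Stage 1: classify dead/alive status.
    clf = build_classifier(N_FEATURES)
    clf.fit(X, y_status, batch_size=BATCH_SIZE, epochs=EPOCHS,
            validation_split=0.2, verbose=0)

    # Stage 2: regress survival months only for patients predicted alive.
    alive_mask = clf.predict(X, verbose=0).ravel() >= 0.5
    reg = build_regressor(N_FEATURES)
    reg.fit(X[alive_mask], y_months[alive_mask], batch_size=BATCH_SIZE,
            epochs=EPOCHS, validation_split=0.2, verbose=0)

In this sketch the two stages are trained separately; the feedback of batch errors into subsequent iterations, as described in the abstract, corresponds to the ordinary mini-batch gradient updates performed by the Adam optimizer during fit.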

Key-Words / Index Term

Lung Cancer, Diabetes, Survivable Rate, Artificial Neural Network, DNN, Op-DNN, Classifier, SVM, NBs, C4.5, Optimizer, Adam, ReLU, Epoch, Batch Size, Op-DNN Regression
