 

Bayesian Regularized Neural Networks for Small N Big P Data

dc.contributor.author Okut, Hayrettin
dc.date.accessioned 2025-05-10T17:39:39Z
dc.date.available 2025-05-10T17:39:39Z
dc.date.issued 2016
dc.department T.C. Van Yüzüncü Yıl Üniversitesi en_US
dc.department-temp [Okut, Hayrettin] Yuzuncu Yil Univ, Fac Agr, Biometry & Genet Branch, Van, Turkey; [Okut, Hayrettin] Wake Forest Univ, Bowman Gray Sch Med, Ctr Diabet Res, Ctr Genom & Personalized Med Res, Winston Salem, NC 27109 USA en_US
dc.description.abstract Artificial neural networks (ANNs) mimic the function of the human brain and can implement massively parallel computations for mapping, function approximation, classification, and pattern recognition. ANNs can capture highly nonlinear associations between input (predictor) and target (response) variables and can adaptively learn complex functional forms. Like other parametric and nonparametric methods, such as kernel regression and smoothing splines, ANNs can overfit (particularly with high-dimensional data such as genome-wide association (GWAS) and microarray data), and the resulting predictions can fall outside the range of the training data. Regularization (shrinkage) in ANNs biases parameter estimates towards values considered probable. The most common regularization techniques in ANNs are Bayesian regularization (BR) and early stopping. Early stopping effectively limits the weights used in the network and thus imposes regularization, lowering the Vapnik-Chervonenkis dimension. In Bayesian regularized ANNs (BRANNs), regularization imposes certain prior distributions on the model parameters and penalizes large weights in anticipation of achieving a smoother mapping. en_US
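The abstract contrasts two regularization strategies for ANNs in the small-n, big-p setting: a penalty on large weights (the effect of the Gaussian prior in Bayesian regularization) and early stopping. The following is a minimal sketch, not taken from the chapter, showing both ideas with scikit-learn's MLPRegressor on simulated p >> n data; the alpha parameter applies an L2 weight penalty as a frequentist stand-in for the BRANN prior, and early_stopping halts training when validation error stops improving.

    # Sketch only: L2 weight shrinkage vs. early stopping on simulated p >> n data.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    n, p = 100, 1000                                  # "small N, big P"
    X = rng.standard_normal((n, p))
    y = X[:, :5] @ rng.standard_normal(5) + 0.1 * rng.standard_normal(n)

    # Shrinkage: L2 penalty on the weights (analogue of the Gaussian prior in BRANN)
    penalized = MLPRegressor(hidden_layer_sizes=(10,), alpha=1.0,
                             max_iter=2000, random_state=0).fit(X, y)

    # Early stopping: halt training when held-out validation error stops improving
    early = MLPRegressor(hidden_layer_sizes=(10,), early_stopping=True,
                         validation_fraction=0.2, max_iter=2000,
                         random_state=0).fit(X, y)

Both models constrain the effective size of the weights, which is what keeps predictions from drifting outside the range of the training data when p greatly exceeds n.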
dc.description.woscitationindex Book Citation Index – Science
dc.identifier.doi 10.5772/63256
dc.identifier.endpage 48 en_US
dc.identifier.isbn 9789535127055
dc.identifier.isbn 9789535127048
dc.identifier.scopusquality N/A
dc.identifier.startpage 27 en_US
dc.identifier.uri https://doi.org/10.5772/63256
dc.identifier.uri https://hdl.handle.net/20.500.14720/14963
dc.identifier.wos WOS:000398803000003
dc.identifier.wosquality N/A
dc.institutionauthor Okut, Hayrettin
dc.language.iso en en_US
dc.publisher InTech Europe en_US
dc.relation.publicationcategory Book Chapter - International en_US
dc.rights info:eu-repo/semantics/openAccess en_US
dc.subject Artificial Neural Network en_US
dc.subject Bayesian Regularization en_US
dc.subject Shrinkage en_US
dc.subject P >> N en_US
dc.subject Prediction Ability en_US
dc.title Bayesian Regularized Neural Networks for Small N Big P Data en_US
dc.type Book Part en_US