Please use this identifier to cite or link to this item: http://hdl.handle.net/10263/7168
Title: Efficient Automatic Optimization of Neural Network Architecture
Authors: Pathak, Harsharaj
Keywords: Supervised Machine Learning
Multilayer Perceptron
Issue Date: 2020
Publisher: Indian Statistical Institute, Kolkata
Citation: 48p.
Series/Report no.: Dissertation;2020-14
Abstract: Neural networks are at the heart of deep learning frameworks, which have yielded excellent results in various complex problem domains. However, designing a neural network architecture is a challenging task: judicious selection of the architecture and manual tuning of network parameters is a tedious and time-consuming process. There has been substantial effort to automate neural network design using various heuristic algorithms. Evolutionary algorithms are among the most successful methods for automating the architecture search process, but they are very computation intensive. We therefore explore a technique that could lead to faster evolutionary algorithms for finding optimal neural network architectures. We also survey various alternative methods.
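For readers unfamiliar with evolutionary architecture search, the sketch below illustrates the general idea at a toy scale: each architecture is encoded as a genome (here, a tuple of hidden-layer widths), every candidate is trained and scored on held-out data, and the fittest candidates are mutated to form the next generation. This is a generic illustration, not the method developed in the dissertation; the dataset, encoding, mutation scheme, and use of scikit-learn's MLPClassifier are assumptions chosen for brevity.

```python
# Illustrative sketch of evolutionary architecture search (not the dissertation's method).
# Assumptions: architectures are tuples of hidden-layer widths, fitness is validation
# accuracy of a briefly trained scikit-learn MLP on the digits dataset.
import random
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

def random_architecture():
    # Encode an architecture as a tuple of hidden-layer widths.
    depth = random.randint(1, 3)
    return tuple(random.choice([16, 32, 64, 128]) for _ in range(depth))

def fitness(arch):
    # Fitness = validation accuracy of an MLP with this architecture.
    model = MLPClassifier(hidden_layer_sizes=arch, max_iter=200, random_state=0)
    model.fit(X_tr, y_tr)
    return model.score(X_val, y_val)

def mutate(arch):
    # Perturb one randomly chosen layer width (a very simple mutation operator).
    layers = list(arch)
    i = random.randrange(len(layers))
    layers[i] = random.choice([16, 32, 64, 128])
    return tuple(layers)

population = [random_architecture() for _ in range(6)]
for generation in range(5):
    ranked = sorted(population, key=fitness, reverse=True)
    parents = ranked[:3]                                  # keep the fittest architectures
    children = [mutate(random.choice(parents)) for _ in range(3)]
    population = parents + children
    print(f"generation {generation}: best architecture so far {ranked[0]}")
```

Because every candidate must be trained before it can be scored, even this toy loop trains dozens of networks; that cost is exactly what motivates the search for faster evolutionary techniques discussed in the dissertation.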
Description: Dissertation under the supervision of Prof. Ashish Ghosh, MIU
URI: http://hdl.handle.net/10263/7168
Appears in Collections:Dissertations - M Tech (CS)

Files in This Item:
File: HarsharajPathakDissertationCS1819_CD.pdf
Size: 1.11 MB
Format: Adobe PDF

