Please use this identifier to cite or link to this item:
http://hdl.handle.net/10263/7261
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Varshney, Ankit | - |
dc.date.accessioned | 2022-02-02T05:09:58Z | - |
dc.date.available | 2022-02-02T05:09:58Z | - |
dc.date.issued | 2019-07 | - |
dc.identifier.citation | 31p. | en_US |
dc.identifier.uri | http://hdl.handle.net/10263/7261 | - |
dc.description | Dissertation under the supervision of Prof. Nikhil R. Pal | en_US |
dc.description.abstract | In this thesis, we develop three new methods for feature selection with Multi Layer Perceptron (MLP) neural networks. Each method uses a two-step approach. First, we train an MLP network on a given dataset. Second, we introduce feature selector variables and form an optimization problem based on a penalty on these feature selector variables and a measure of redundancy. We then optimize this problem using gradient descent to find nearly optimal values of the feature selector variables while keeping the weights of the MLP network fixed. The first method, which we call Feature Selection with MLP using Approximate L0-norm and Global Redundancy Control (FSMLP-AL-GRC), uses a penalty based on an approximate L0-norm and global redundancy, i.e., redundancy computed from feature values alone, without considering class information. For the second method, we first define a new redundancy measure that uses class label information; we call it class-level redundancy. This method uses the class-level redundancy measure along with an approximate L0-norm based penalty, and we call it Feature Selection with MLP using Approximate L0-norm and Class-level Redundancy Control (FSMLP-AL-CRC). The last method is a variant of the second. Here, we replace each feature selector variable with a non-linear bounded function that always lies between 0 and 1; these functions act as feature-attenuating gates. We call this method Gated Feature Selection with MLP using Class-level Redundancy Control (Gated-FSMLP-CRC). We test these methods experimentally on several datasets. We also present results on the Sonar data using Gated-FSMLP-CRC without keeping the MLP weights fixed during the learning process. | en_US |
dc.language.iso | en | en_US |
dc.publisher | Indian Statistical Institute, Kolkata | en_US |
dc.relation.ispartofseries | Dissertation;;2019:13 | - |
dc.subject | Multi-layer perceptrons | en_US |
dc.subject | Global Redundancy Control | en_US |
dc.title | Feature Selection With Multi Layer Perceptron Using Approximate L0-norm and Redundancy Control | en_US |
dc.type | Other | en_US |
Appears in Collections: | Dissertations - M Tech (CS) |
Files in This Item:
File | Description | Size | Format
---|---|---|---
CS1710_thesis_Ankit_Varshney.pdf | | 716.26 kB | Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
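The abstract above outlines a gated feature-selection scheme: bounded gates attenuate each input feature, and an approximate L0-norm penalty drives most gates toward zero while the pre-trained MLP weights stay frozen. Below is a minimal, hypothetical PyTorch sketch of that general idea, not the thesis's actual formulation: the gate function exp(-f^2), the surrogate L0 penalty, the initialization, and all names (GatedMLP, l0_approx, lam, beta) are assumptions made for illustration, and the class-level redundancy term described in the abstract is omitted.

```python
import torch
import torch.nn as nn

class GatedMLP(nn.Module):
    """MLP whose inputs pass through bounded feature-attenuating gates."""
    def __init__(self, n_features, n_hidden, n_classes):
        super().__init__()
        # One selector parameter f_i per feature; the gate exp(-f_i^2) maps it
        # into (0, 1]. Initializing f_i = 2 starts all gates nearly closed
        # (an assumption made here so that the gate gradients are non-zero).
        self.f = nn.Parameter(torch.full((n_features,), 2.0))
        self.mlp = nn.Sequential(
            nn.Linear(n_features, n_hidden), nn.Tanh(),
            nn.Linear(n_hidden, n_classes),
        )

    def gates(self):
        return torch.exp(-self.f ** 2)

    def forward(self, x):
        return self.mlp(x * self.gates())

def l0_approx(gates, beta=5.0):
    # Smooth surrogate for the L0 norm of the gate vector:
    # 1 - exp(-beta * g^2) is near 0 for a closed gate and near 1 for an open one.
    return torch.sum(1.0 - torch.exp(-beta * gates ** 2))

def select_features(model, x, y, lam=0.01, steps=500, lr=0.05):
    # Second step of the two-step scheme: the MLP weights are assumed to be
    # already trained; freeze them and update only the selector parameters f.
    for p in model.mlp.parameters():
        p.requires_grad_(False)
    opt = torch.optim.SGD([model.f], lr=lr)
    ce = nn.CrossEntropyLoss()
    for _ in range(steps):
        opt.zero_grad()
        loss = ce(model(x), y) + lam * l0_approx(model.gates())
        loss.backward()
        opt.step()
    return model.gates().detach()  # gates near 1 mark selected features
```

In this sketch, gates that remain close to 1 after optimization indicate selected features; the global or class-level redundancy control described in the abstract would enter the loss as additional penalty terms.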