FReLU: Flexible Rectified Linear Units for Improving Convolutional Neural Networks
Citations: 97 | References: 21 | Year: 2018 | Venue: Unknown
Keywords: Convolutional Neural Network, Engineering, Machine Learning, Image Analysis, Data Science, Pattern Recognition, Sparse Neural Network, ReLU Networks, Linear Unit, Video Transformer, Activation Function, Data Augmentation, Machine Vision, Computer Science, Medical Image Computing, Deep Learning, Neural Architecture Search, Model Compression, Computer Vision, Deep Neural Networks
The rectified linear unit (ReLU) is a widely used activation function for deep convolutional neural networks. However, because of its hard rectification at zero, ReLU networks lose the benefits of negative values. In this paper, we propose a novel activation function called the flexible rectified linear unit (FReLU) to further explore the effects of negative values. By redesigning the rectified point of ReLU as a learnable parameter, FReLU expands the states of the activation output. When a network is successfully trained, FReLU tends to converge to a negative value, which improves expressiveness and thus performance. Furthermore, FReLU is designed to be simple and effective, avoiding exponential functions to keep computation low-cost. Because it is self-adaptive and does not rely on strict assumptions, FReLU can be easily used in various network architectures. We evaluate FReLU on three standard image classification datasets: CIFAR-10, CIFAR-100, and ImageNet. Experimental results show that FReLU achieves fast convergence and competitive performance on both plain and residual networks.
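The abstract describes FReLU as ReLU with its rectified point turned into a learnable parameter, so the output range can extend below zero. Below is a minimal PyTorch sketch of that idea, assuming the formulation frelu(x) = relu(x) + b with a learnable shift b; the per-channel parameter shape and zero initialization are illustrative assumptions, not details taken from the abstract.

```python
import torch
import torch.nn as nn


class FReLU(nn.Module):
    """Flexible ReLU sketch: relu(x) + b with a learnable shift b.

    When b converges to a negative value during training, the activation
    can emit negative outputs, matching the behavior the abstract describes.
    Per-channel b and zero init are assumptions for illustration.
    """

    def __init__(self, num_channels: int):
        super().__init__()
        # One learnable shift per channel, initialized at zero (assumption).
        self.b = nn.Parameter(torch.zeros(num_channels))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Broadcast the per-channel shift over (N, C, H, W) feature maps.
        return torch.relu(x) + self.b.view(1, -1, 1, 1)


# Usage: drop-in replacement for nn.ReLU after a conv layer.
act = FReLU(num_channels=64)
y = act(torch.randn(8, 64, 32, 32))
```

Note that, unlike PReLU or ELU, this form adds no exponential and only one extra parameter per channel, consistent with the abstract's emphasis on low-cost computation.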