DOI: 10.1145/3638584.3638604

Activation Function: Absolute Function, One Function Behaves More Individualized

Published: 14 March 2024

Abstract

Inspired by patterns in the natural world, an activation function, the absolute-value function, is proposed. By the principles of probability, stimuli in the natural world follow a normal distribution: stimulation that occurs frequently has a low value, shown around zero in Figure 1, while stimulation that occurs only occasionally has a high value, shown far from zero in Figure 1. A high value therefore corresponds to a strong stimulus, which represents individualization. From tests on the MNIST dataset with a fully connected neural network and a convolutional neural network, several conclusions are drawn. The accuracy curve of the absolute function fluctuates slightly, unlike the accuracy curves of ReLU and Leaky ReLU. Because the absolute function keeps the negative part of its input with the same weight as the positive part, its individualization is more active than that of ReLU or Leaky ReLU. With respect to generalization, this individualization is the cause of the fluctuation: accuracy may be better on some sets and worse on others. The absolute function is also less likely to over-fit. The smaller the batch size, the clearer the individualization, and vice versa; to change the individualization of the absolute function, simply change the batch size. A further test on MNIST with an autoencoder shows that Leaky ReLU handles the classification task well, while the absolute function handles the generation task well, because classification requires more universality and generation requires more individualization.
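As a minimal sketch (not taken from the paper; the function names and the Leaky ReLU slope of 0.01 are illustrative assumptions), the three activations compared in the abstract can be written in NumPy, showing how the absolute function keeps the magnitude of negative inputs while ReLU discards them and Leaky ReLU attenuates them:

```python
import numpy as np

def absolute(x):
    # Absolute-value activation: a negative input is kept with the same
    # magnitude as a positive one, so the negative part contributes as
    # much as the positive part.
    return np.abs(x)

def relu(x):
    # ReLU discards negative inputs entirely.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU keeps only a small fraction (alpha) of negative inputs.
    return np.where(x >= 0.0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
a = absolute(x)    # values: 2.0, 0.5, 0.0, 0.5, 2.0
r = relu(x)        # values: 0.0, 0.0, 0.0, 0.5, 2.0
l = leaky_relu(x)  # negative inputs shrink toward zero
```

Under this sketch, the abstract's claim is visible directly: `absolute` is the only one of the three whose output treats a stimulus of -2.0 the same as one of +2.0.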



    Published In

    CSAI '23: Proceedings of the 2023 7th International Conference on Computer Science and Artificial Intelligence
    December 2023
    563 pages
    ISBN:9798400708688
    DOI:10.1145/3638584

    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. activation function
    2. absolute function
    3. abstract network
    4. concrete network
    5. individualization
    6. over-fitting
    7. stimulation
    8. universality

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

