
DATAI Seminars. Academic year 2025-2026


Milan Stehlík


Revolution in neural networks: introducing SPOCU

11/09/2025. Milan Stehlík, Universidad de Valparaíso, Chile.

PDF

Abstract

In this talk, I will introduce our adaptive transfer function SPOCU, developed with collaborators to address the inability of standard transfer functions to properly process real data flows. SPOCU is a substantial advance in the speed and adaptability of adaptation strategies, filling a gap in existing technology. Activation functions are crucial in deep learning for extracting complex data patterns, yet traditional functions such as ReLU and SELU adapt poorly to specialized tasks. Because standard transfer functions break down in complex setups, robust approaches such as large-scale self-normalizing neural networks become necessary. To address this, we propose a novel trainable adaptive activation function based on the SPOCU construction.
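As a rough illustration of the kind of construction the abstract refers to, the sketch below implements an SPOCU-style activation built from a bounded polynomial kernel, s(x) = α·h(x/γ + β) − α·h(β). The kernel r(x) = x³(x⁵ − 2x⁴ + 2) follows my recollection of the published SPOCU paper, and the default parameter values are illustrative placeholders, not the authors' tuned settings:

```python
import numpy as np

def r(x):
    # polynomial kernel; the form x^3 (x^5 - 2x^4 + 2) is an assumption
    # based on the published SPOCU construction
    return x**3 * (x**5 - 2 * x**4 + 2)

def h(x, c=1.0):
    # clipped kernel: 0 for x < 0, r(x) on [0, c), saturating at r(c) above
    return np.where(x < 0, 0.0, np.where(x < c, r(x), r(c)))

def spocu(x, alpha=1.0, beta=0.5, gamma=1.0, c=1.0):
    # SPOCU-style activation; subtracting alpha * h(beta) centers it at 0
    return alpha * h(x / gamma + beta, c) - alpha * h(beta, c)
```

By construction the function passes through the origin (the α·h(β) offset cancels there), is negative and flat for large negative inputs, and saturates for large positive inputs when c is finite.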

Dynamical networks face challenges with big, irregular data, so the choice of activation function and the management of hyperparameters are crucial. The SPOCU transfer function offers flexibility and superior performance in machine learning tasks: experimental results show improvements in cancer diagnosis and in modeling pollutant adsorption dynamics. Developing adaptive algorithms for hyperparameter selection is essential, and our milestone result, DExPSO, offers a way to avoid the recurrent premature failures of standard neural networks. We also showed how the adaptation of Cobetia bacteria to large temperature ranges can be modeled by an SPOCU-based neural network.
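The abstract does not spell out how DExPSO works, but it belongs to the particle-swarm family. As a generic, hypothetical stand-in, here is a minimal particle-swarm search over SPOCU-style hyperparameters (alpha, beta, gamma); the quadratic `loss` is a toy objective standing in for the validation loss of a real network:

```python
import numpy as np

rng = np.random.default_rng(0)

def loss(params):
    # toy stand-in objective; in practice this would be the validation
    # loss of a network using SPOCU(alpha, beta, gamma)
    alpha, beta, gamma = params
    return (alpha - 2.0) ** 2 + (beta - 0.5) ** 2 + (gamma - 1.5) ** 2

# minimal particle swarm optimisation over (alpha, beta, gamma)
n_particles, n_iters = 20, 100
pos = rng.uniform([0.1, 0.0, 0.1], [5.0, 1.0, 5.0], size=(n_particles, 3))
vel = np.zeros_like(pos)
pbest = pos.copy()                                   # per-particle best position
pbest_val = np.array([loss(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()             # swarm-wide best position

for _ in range(n_iters):
    r1, r2 = rng.random((2, n_particles, 1))
    # inertia + attraction to personal and global bests (standard PSO update)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    vals = np.array([loss(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()
```

This is only the vanilla PSO update rule; DExPSO itself presumably adds the dynamic-exchange machinery the talk will describe.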

During the talk, we will discuss how the SPOCU prototype adaptive function was created and explore new ideas for optimizing hyperparameters in adaptive transfer functions such as SPOCU for real-world data flows, improving methodologies across different application areas.
