Session Program


  • 11 July 2017
  • 04:00 PM - 06:00 PM
  • Room: Catalana
  • Chairs: Radko Mesiar, Javier Montero and Humberto Bustince

Generalizations of the notion of aggregation function

Abstract - In this contribution, we discuss the role and potential of pre-aggregation functions. This class of functions satisfies the same boundary conditions as aggregation functions, but differs in the monotonicity requirement: standard monotonicity is replaced by the so-called directional monotonicity. With this work, we attempt to shed light on the relationship between aggregation and pre-aggregation functions, and to discuss some construction methods for pre-aggregation functions.
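For orientation, a minimal sketch of the definitions this abstract builds on, following the usual formulation in the literature (notation is ours): a function F: [0,1]^n -> [0,1] is r-increasing for a direction r ≠ 0 if F(x + c·r) >= F(x) for every point x and every c > 0 such that x + c·r stays in [0,1]^n. A pre-aggregation function satisfies F(0, ..., 0) = 0 and F(1, ..., 1) = 1 and is r-increasing for at least one such direction; standard monotonicity corresponds to being e_i-increasing for every canonical direction e_1, ..., e_n.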
Abstract - In this work we propose a new generalization of the notion of monotonicity, the so-called ordered directional monotonicity. With this new notion, the direction of increase or decrease at a given point depends on the point itself, so it need not be the same for every value in the domain of the considered function.
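One common formalization, offered here as a hedged sketch (the exact permutation convention varies across papers): F: [0,1]^n -> [0,1] is ordered directionally r-increasing if, for every point x, with σ a permutation sorting its coordinates decreasingly (x_σ(1) >= ... >= x_σ(n)), we have F(x + c·r_σ⁻¹) >= F(x) for every c > 0 keeping the perturbed point in [0,1]^n, where r_σ⁻¹ rearranges the coordinates of r according to σ. The direction applied at x thus depends on how the coordinates of x are ordered, which is exactly the point-dependence described above.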
Abstract - The aim of this paper is to introduce the concept of pre-t-conorms, based on the notion of pre-aggregation function, which was introduced by Lucca et al. as an "aggregation" concept that is not monotonic on its whole domain. We also study the concept of light pre-t-conorms, which are not necessarily associative, commutative functions with neutral element e = 0. We present some properties of (light) pre-t-conorms and the corresponding classes of (light) pre-t-conorms, showing interesting examples. Finally, we present an application of pre-t-conorms and negations for defining directional fuzzy implication functions. We note that the introduction of pre-t-conorms enables applications where full monotonicity is not required (as in classification problems), while light pre-t-conorms can be used in applications that do not require associativity (as in image processing and decision making).
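On a plausible reading of the terminology, stated here as our assumption by analogy with pre-aggregation functions: a pre-t-conorm would be a commutative, associative function S: [0,1]^2 -> [0,1] with neutral element 0 that is only required to be directionally increasing for some direction r, rather than increasing in each argument; a light pre-t-conorm would additionally drop associativity, as the abstract states.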
Abstract - In this work we investigate a new family of fusion functions called internal fusion functions. The main characteristic of these functions is that the output always corresponds to one of the given inputs. We propose a construction method and study whether internal functions constructed in this way also satisfy the properties of aggregation functions. Finally, we apply internal functions to an example of a multi-class problem, where a set of matrices must be combined into a single representative collective matrix in order to obtain better classification rates.
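To make the internality property concrete, here is a minimal Python sketch of one illustrative internal fusion function (a toy construction of ours, not necessarily the construction method proposed in the paper): it returns the input closest to the arithmetic mean, so the output is always one of the inputs.

    def internal_fusion(xs):
        # Internal fusion: the output must equal one of the inputs.
        # Toy choice: return the input closest to the arithmetic mean
        # (ties broken in favour of the earliest such input).
        mean = sum(xs) / len(xs)
        return min(xs, key=lambda x: abs(x - mean))

    # The result is always an element of the input list:
    print(internal_fusion([0.1, 0.4, 0.9]))  # -> 0.4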
Abstract - The aim of this work is to study the class of fuzzy implications obtained from a triple (I, I_1, I_2) of fuzzy implications. This paper discusses under which conditions such functions preserve the main properties of fuzzy implications. In addition, using conjugate fuzzy implications, it is shown that these constructions are preserved under the action of an order automorphism. Finally, we introduce the family of fuzzy implications obtained from the extended classes of triple (I, I_1, I_2)-implications satisfying the generalized exchange principle and distributivity properties; their dual construction is also considered.
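For reference, the textbook definition the abstract relies on (the triple construction itself is specific to the paper): I: [0,1]^2 -> [0,1] is a fuzzy implication if it is decreasing in the first argument, increasing in the second, and satisfies I(0, 0) = I(1, 1) = 1 and I(1, 0) = 0. The exchange principle reads I(x, I(y, z)) = I(y, I(x, z)), and the conjugate of I under an order automorphism φ is I_φ(x, y) = φ⁻¹(I(φ(x), φ(y))).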
Abstract - Feature aggregation is a crucial step in many image classification methods, such as the Bag-of-Words (BoW) model or Convolutional Neural Networks (CNNs). In this aggregation step, usually known as spatial pooling, the descriptors of neighbouring elements within a region of the image are combined into a local or a global feature vector. The combined vector must retain the relevant information while discarding irrelevant and confusing details. Maximum and average are the most common aggregation functions used in the pooling step. To improve the aggregation of relevant information without degrading its discriminative power for classification, in this work we propose the use of Ordered Weighted Averaging (OWA) operators. We provide an extensive evaluation showing that the final classification result using OWA aggregation is always better than average pooling, and better than maximum pooling when dealing with small dictionary sizes.
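As an illustration of the pooling operator involved, here is a minimal Python sketch of an OWA operator (the weight choices below are our own; the paper's weight selection may differ): the inputs are sorted in decreasing order and combined by a weighted sum, so maximum pooling and average pooling arise as the two extreme weight vectors.

    def owa(values, weights):
        # OWA: sort the inputs in decreasing order, then take the
        # weighted sum; weights should be nonnegative and sum to 1.
        assert len(values) == len(weights)
        return sum(w * v for w, v in zip(weights, sorted(values, reverse=True)))

    # Maximum and average pooling as extreme OWA cases:
    feats = [0.2, 0.9, 0.5, 0.4]
    print(owa(feats, [1, 0, 0, 0]))              # -> 0.9 (maximum pooling)
    print(owa(feats, [0.25, 0.25, 0.25, 0.25]))  # -> 0.5 (average pooling)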