Abstract
The color sensor, which is fundamental to the design of image filter algorithms for smartphones, is often developed on the basis of the demands of commercial imaging applications. The creation of a color palette is an essential step for designers when planning color schemes. In this study, we introduce a method for generating blended color imagery, called the ‘color image’ in some previous works,(8,9) to construct personalized palettes that cater to individual emotional needs. We modified the training approach of the autoencoder by incorporating emotion-related data into the training samples, which allowed us to establish a correspondence between color imagery and palettes on the basis of Kobayashi’s color image scale. Through linear interpolation between different types of imagery, we derived the emotional coordinates of blended color imagery. We then fed these derived coordinates into the trained autoencoder model to reconstruct palettes for the blended color imagery. We conducted a visual evaluation experiment, and the results showed that the emotions conveyed by the palettes generated for blended color imagery are consistent with human perception. Additionally, we presented two sample applications: emotional filters and background frameworks. We anticipate that the findings of this study can offer a new perspective on the development and application of the color sensor.
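As a rough illustration of the interpolation-and-decoding step described in the abstract, the following Python sketch blends two hypothetical emotional coordinates on Kobayashi-style warm-cool and soft-hard axes and decodes the result into a palette. The coordinate values, the randomly initialized placeholder decoder (standing in for the decoder of the trained autoencoder), and the five-color palette size are assumptions made for illustration only, not details taken from the paper.

```python
# Hypothetical sketch: blending two color-imagery coordinates and decoding a palette.
# The decoder weights below are random placeholders standing in for the trained
# autoencoder's decoder; axis names follow Kobayashi's warm-cool / soft-hard scale.
import numpy as np

rng = np.random.default_rng(0)

# Emotional coordinates (warm-cool, soft-hard) for two source imagery types
# (illustrative values only).
coord_a = np.array([-0.6, 0.4])   # e.g. a "cool / soft" imagery type
coord_b = np.array([0.5, -0.3])   # e.g. a "warm / hard" imagery type

def blend(a, b, t):
    """Linear interpolation between two emotional coordinates, t in [0, 1]."""
    return (1.0 - t) * a + t * b

# Placeholder decoder: maps a 2-D emotional coordinate to a 5-color RGB palette.
W1 = rng.normal(size=(2, 16))
W2 = rng.normal(size=(16, 15))  # 5 colors x 3 RGB channels

def decode_palette(coord):
    h = np.tanh(coord @ W1)                  # hidden layer
    rgb = 1.0 / (1.0 + np.exp(-(h @ W2)))    # sigmoid keeps values in [0, 1]
    return rgb.reshape(5, 3)

blended_coord = blend(coord_a, coord_b, t=0.5)
palette = decode_palette(blended_coord)
print("Blended coordinate:", blended_coord)
print("Generated palette (RGB rows):")
print(np.round(palette, 3))
```

In the study itself, the decoding step is performed by the autoencoder trained with emotion-labeled samples; the random weights here merely make the sketch self-contained and runnable.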
| Original language | English |
| --- | --- |
| Pages (from-to) | 135-146 |
| Number of pages | 12 |
| Journal | Sensors and Materials |
| Volume | 36 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 2024 |
Keywords
- autoencoder
- color filter
- color image
- color palette
ASJC Scopus subject areas
- Instrumentation
- General Materials Science