Color Palette Generation of Mixed Color Images Using Autoencoder

Tzren Ru Chou*, Jie Yun Shao

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review



The color sensor, fundamental to the design of image filter algorithms for smartphones, is often developed to meet the demands of commercial imaging applications. Creating a color palette is an essential step for designers when planning color schemes. In this study, we introduce a method for generating blended color imagery, called the ‘color image’ in some previous works,(8,9) to construct personalized palettes that cater to individual emotional needs. We modified the training approach of the autoencoder by incorporating emotion-related data into the training samples, which allowed us to establish a correspondence between color imagery and palettes on the basis of Kobayashi’s color image scale. Through linear interpolation between different types of imagery, we derived the emotional coordinates of blended color imagery and fed these coordinates into the trained autoencoder model to reconstruct the palette for the blended imagery. A visual evaluation experiment showed that the emotion conveyed by the generated blended-imagery palettes is consistent with human perception. We also present two sample applications: emotional filters and background frameworks. We anticipate that the findings of this study offer a new perspective for the development and application of the color sensor.
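The pipeline the abstract describes — linearly interpolating between the emotional coordinates of two imagery types, then decoding the blended coordinate into a palette — can be sketched as follows. This is an illustrative toy only: the paper's trained autoencoder is replaced by a small untrained decoder, and all coordinate values, dimensions, and names (`ToyDecoder`, the 2-D warm/cool and soft/hard axes) are hypothetical stand-ins, not the authors' actual model.

```python
# Hypothetical sketch of coordinate interpolation + palette decoding.
# The real system uses a trained autoencoder on Kobayashi's color image scale;
# here an untrained linear layer with a sigmoid stands in for the decoder.
import math
import random

def interpolate(coord_a, coord_b, t):
    """Linearly interpolate between two emotional coordinates at ratio t in [0, 1]."""
    return [(1 - t) * a + t * b for a, b in zip(coord_a, coord_b)]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class ToyDecoder:
    """Stand-in for the decoder half of the autoencoder: maps a 2-D emotional
    coordinate to a 3-color RGB palette (9 channel values, each in (0, 1))."""
    def __init__(self, seed=0):
        rng = random.Random(seed)
        self.weights = [[rng.uniform(-1, 1) for _ in range(2)] for _ in range(9)]
        self.biases = [rng.uniform(-1, 1) for _ in range(9)]

    def decode(self, coord):
        return [sigmoid(sum(w * c for w, c in zip(row, coord)) + b)
                for row, b in zip(self.weights, self.biases)]

# Blend two imagery types (hypothetical coordinates on warm/cool and
# soft/hard axes) at the midpoint, then decode into a palette.
romantic = [-0.8, 0.6]
modern = [0.5, -0.7]
blended = interpolate(romantic, modern, 0.5)
palette = ToyDecoder().decode(blended)
```

With a trained decoder in place of `ToyDecoder`, the decoded values would be the RGB channels of the generated blended-imagery palette.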

Original language: English
Pages (from-to): 135-146
Number of pages: 12
Journal: Sensors and Materials
Issue number: 1
Publication status: Published - 2024


Keywords

  • autoencoder
  • color filter
  • color image
  • color palette

ASJC Scopus subject areas

  • Instrumentation
  • General Materials Science


