1. The design of the book cover starts from a brief that includes the desired values. The peripheral display shows covers of similar books on the market, sorted by color and composition. How it could work: Bibliographic metadata are used to retrieve matching book covers, and the layout of the display is driven by an analysis of their color distribution.
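The color-driven layout in this step could be approximated quite simply. The sketch below, a minimal illustration rather than a proposed implementation, sorts covers by their dominant hue so that similar-colored covers end up next to each other; the tiny pixel lists stand in for real cover images.

```python
import colorsys

def dominant_hue(pixels):
    """Average hue of an image, given as a list of (r, g, b) tuples in 0-255."""
    hues = [colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)[0] for r, g, b in pixels]
    return sum(hues) / len(hues)

def sort_covers_by_color(covers):
    """Sort (title, pixels) pairs by dominant hue so similar colors sit together."""
    return sorted(covers, key=lambda cover: dominant_hue(cover[1]))

# Toy one-pixel "covers" for illustration only
covers = [
    ("blue cover", [(20, 40, 200)]),
    ("red cover", [(200, 30, 30)]),
    ("green cover", [(30, 180, 40)]),
]
```

A production version would of course cluster full color histograms rather than a single average hue, but the ordering principle is the same.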
2. The designers pull one of the existing covers from the wall onto the tablet and start sketching on top of it. The display changes to a more typical mood board, showing inspirational images relating to the desired values. How it could work: Inspirational images are selected by matching desired values against user-generated image metadata. The matching algorithm is based on semantic rather than merely lexical proximity.
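The semantic-versus-lexical distinction can be sketched with embedding vectors: "calm" and "serene" share no letters, so a lexical match fails, but their vectors lie close together. The hand-made vectors below are toy stand-ins; a real system would use learned embeddings from a model such as word2vec or CLIP.

```python
import math

# Toy vectors standing in for learned semantic embeddings (illustration only).
EMBEDDINGS = {
    "calm":      [0.90, 0.10, 0.00],
    "serene":    [0.85, 0.15, 0.05],
    "energetic": [0.10, 0.90, 0.20],
    "vibrant":   [0.15, 0.85, 0.25],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def rank_images(desired_value, tagged_images, k=2):
    """Rank (name, tags) pairs by semantic proximity of tags to a desired value."""
    def score(tags):
        return max(cosine(EMBEDDINGS[desired_value], EMBEDDINGS[t])
                   for t in tags if t in EMBEDDINGS)
    return sorted(tagged_images, key=lambda img: score(img[1]), reverse=True)[:k]

images = [("sunset.jpg", ["serene"]), ("party.jpg", ["vibrant"])]
```

Here `rank_images("calm", images, 1)` surfaces `sunset.jpg` even though the tag "serene" never matches "calm" as a string.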
3. The sketching now turns to the theme of flowers. The mood board reflects this orientation in its selection of inspirational images. How it could work: Here, the tablet sketching app becomes a key source of input. A combination of shape matching and semantic analysis of metadata is used to select inspirational images pertaining to a specific visual theme.
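Combining shape matching with metadata filtering might look like the sketch below. It uses Jaccard overlap between binary silhouette masks as a stand-in for real shape matching (a production system would use contour descriptors or a learned similarity); the mask and tag data are hypothetical.

```python
def mask_overlap(a, b):
    """Jaccard overlap between two binary masks (equal-sized lists of rows)."""
    inter = sum(x and y for ra, rb in zip(a, b) for x, y in zip(ra, rb))
    union = sum(x or y for ra, rb in zip(a, b) for x, y in zip(ra, rb))
    return inter / union if union else 0.0

def select_inspiration(sketch_mask, candidates, theme, threshold=0.5):
    """Keep images whose tags mention the theme and whose silhouette
    resembles the current sketch closely enough."""
    return [name for name, mask, tags in candidates
            if theme in tags and mask_overlap(sketch_mask, mask) >= threshold]

# Toy 2x2 masks for illustration
sketch = [[1, 1], [0, 0]]
candidates = [
    ("rose.png",  [[1, 1], [0, 0]], {"flower"}),
    ("car.png",   [[0, 0], [1, 1]], {"vehicle"}),
    ("tulip.png", [[1, 0], [0, 0]], {"flower"}),
]
```

The two-stage filter reflects the text: semantic metadata narrows the pool to the theme, shape matching then picks images that echo what is being drawn.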
4. A first complete sketch of the new book cover is discussed. How it could work: The user interface of the sketching app provides controls for moving visual content between tablet and display. Hence, the display serves not only as a peripheral and inspirational mood board, but at times also as a surface for focal collaboration.
Dynamic mood boards (2017)

The idea is that mood boards could change in accordance with the work being done.
A four-step photoboard illustrates the concept for the graphic-design task of designing a book cover.
The idea has been discussed at least since the 1990s but, to the best of my knowledge, has never seen wide use. Image analysis and visualization technology is now clearly ripe for implementing the concept of dynamic mood boards.
The work was done together with Mattias Arvola of Linköping University.

Jonas Löwgren
Professor, Eksjö, Sweden