1. The design of the book cover starts from a brief that includes the desired values. The peripheral display shows covers of similar books on the market, sorted by color and composition. How it could work: Bibliographic metadata are used to retrieve matching book covers. The layout of the display is driven by an analysis of each cover's color distribution.
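The color-driven layout could be sketched as follows. This is a minimal illustration, not the system's actual implementation: it assumes a hypothetical upstream color-quantisation step that supplies a dominant RGB value per retrieved cover, and simply orders covers along the hue circle so that similarly colored covers appear next to each other on the display.

```python
import colorsys

def sort_covers_by_hue(covers):
    """Order cover records by the hue of their dominant color.

    `covers` is a list of dicts with a hypothetical 'dominant_rgb'
    field (0-255 per channel), assumed to come from an upstream
    color-quantisation step.
    """
    def hue(cover):
        r, g, b = (c / 255.0 for c in cover["dominant_rgb"])
        h, _, _ = colorsys.rgb_to_hsv(r, g, b)  # hue in [0, 1)
        return h
    return sorted(covers, key=hue)

# Toy data: three covers with different dominant colors.
covers = [
    {"title": "A", "dominant_rgb": (200, 30, 30)},   # red, hue ~0.0
    {"title": "B", "dominant_rgb": (30, 30, 200)},   # blue, hue ~0.67
    {"title": "C", "dominant_rgb": (30, 200, 30)},   # green, hue ~0.33
]
```

A real layout engine would also weigh composition (e.g. placement of title and imagery), but hue ordering alone already yields the color-banded wall described in the scenario.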
2. The designers transfer one of the existing covers from the wall display to the tablet and start sketching on top of it. The display changes to a more typical mood board, showing inspirational images related to the desired values. How it could work: Inspirational images are selected by matching the desired values against user-generated image metadata. The matching algorithm is based on semantic rather than merely lexical proximity.
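Semantic rather than lexical matching could be realised by comparing embedding vectors instead of literal strings. The sketch below is illustrative only: the two-dimensional vectors stand in for embeddings that would, in practice, come from a pretrained semantic model, and the `tag_vec` field is a hypothetical per-image aggregate of its tag embeddings. Ranking is by cosine similarity, so an image tagged "whimsical" can match the value "playful" even though the words share no letters.

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = sqrt(sum(a * a for a in u))
    norm_v = sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def rank_images(value_vec, images):
    """Rank images by semantic proximity of their (hypothetical)
    tag embedding to the embedding of a desired value."""
    return sorted(images,
                  key=lambda img: cosine(value_vec, img["tag_vec"]),
                  reverse=True)

# Toy embeddings: the desired value "playful" sits near the
# tags of image A and far from those of image B.
playful = (0.9, 0.1)
images = [
    {"title": "A", "tag_vec": (0.8, 0.2)},  # semantically close
    {"title": "B", "tag_vec": (0.1, 0.9)},  # semantically distant
]
```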
3. The sketching now turns to a floral theme. The mood board reflects this orientation in its selection of inspirational images. How it could work: Here, the tablet sketching app becomes a key source of input. A combination of shape matching and semantic analysis of metadata is used to select inspirational images pertaining to a specific visual theme.
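The combination of shape matching and semantic metadata analysis could amount to blending two similarity scores. The sketch below is a simplified illustration under stated assumptions: shapes are represented as hypothetical normalised edge-orientation histograms (summing to 1), shape similarity is one minus half the L1 distance between histograms, semantic similarity is the Jaccard overlap between theme keywords and image tags, and the two are combined with a weighted sum.

```python
def shape_sim(hist_a, hist_b):
    """Similarity of two normalised orientation histograms:
    1 - (L1 distance / 2), giving a value in [0, 1]."""
    return 1 - sum(abs(x - y) for x, y in zip(hist_a, hist_b)) / 2

def tag_sim(theme_tags, image_tags):
    """Jaccard overlap between theme keywords and image tags."""
    union = theme_tags | image_tags
    return len(theme_tags & image_tags) / len(union) if union else 0.0

def blended_score(sketch_hist, theme_tags, image, w_shape=0.5):
    """Weighted blend of shape and semantic similarity."""
    return (w_shape * shape_sim(sketch_hist, image["hist"])
            + (1 - w_shape) * tag_sim(theme_tags, set(image["tags"])))

# Toy data: the sketch's histogram and a floral theme.
sketch_hist = [0.5, 0.5]
theme = {"flower", "blossom"}
rose = {"title": "rose",  "hist": [0.5, 0.5], "tags": ["flower", "petal"]}
car  = {"title": "car",   "hist": [1.0, 0.0], "tags": ["car"]}
```

The weight `w_shape` is itself a design choice: raising it makes the mood board follow the sketch's forms, lowering it makes it follow the stated theme.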
4. A first complete sketch of the new book cover is discussed. How it could work: The user interface of the sketching app provides controls for moving visual content between tablet and display. Hence, the display serves not only as a peripheral, inspirational mood board but at times as a surface for focal collaboration.
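Moving visual content between tablet and display can be modelled as a small shared-state protocol: each item has a current surface, and the app's controls issue push/pull commands. The class below is a hypothetical minimal model, not the scenario's actual software; in a real system the state change would additionally trigger rendering on the target device.

```python
class SharedCanvas:
    """Minimal model of content moving between the tablet and
    the wall display (both surface names are illustrative)."""

    SURFACES = ("tablet", "display")

    def __init__(self):
        self.location = {}  # item id -> current surface

    def push_to_display(self, item_id):
        """Move an item from the tablet to the wall display,
        e.g. to present a sketch for focal collaboration."""
        self.location[item_id] = "display"

    def pull_to_tablet(self, item_id):
        """Bring an item onto the tablet for individual sketching."""
        self.location[item_id] = "tablet"

    def on_display(self):
        """Item ids currently shown on the wall display."""
        return {i for i, s in self.location.items() if s == "display"}

# Usage: a first complete sketch is pushed to the wall for discussion.
canvas = SharedCanvas()
canvas.pull_to_tablet("cover-sketch-1")
canvas.push_to_display("cover-sketch-1")
```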