Nerdo-Journal Club: Decoding the Role of Multisensory Sequence Order in Memory Recognition
In their 2025 study published in Scientific Reports, Maack, Ostrowski, and Rose explore how the human brain encodes and retrieves the sequence order of multisensory information; specifically, how the order of auditory and visual stimuli affects memory recognition.
Why review this paper? Because it taps into a subtle but powerful question: Does the sequence in which we experience sensory inputs matter for how we remember them later? That question has implications far beyond the lab. Whether you're designing educational content, immersive experiences, consumer research protocols, or even product packaging that combines sound and visuals, understanding how the brain handles temporal order across senses can shape how we present, test, and interpret multisensory experiences.
In applied neuroscience and behavioral science, we often focus on what people perceive or remember. This paper reminds us to also ask when and in what order—and that might be the key to designing more effective tools for learning, communication, or consumer engagement.
Maack, M.C., Ostrowski, J. & Rose, M. The order of multisensory associative sequences is reinstated as context feature during successful recognition. Sci Rep 15, 18120 (2025). https://doi.org/10.1038/s41598-025-02553-3
🧠 What They Did and Why It Matters
The study by Maack, Ostrowski, and Rose asked a deceptively simple question: Does the order in which we encounter sights and sounds shape how we remember them later? And if so, does the brain actually "replay" that order during recall?
To explore this, the researchers recruited 32 participants and exposed them to pairs of naturalistic stimuli—a sound and an image, like hearing a frog croak after seeing a ship, or vice versa. During this encoding phase, the critical variable was the order: sometimes participants heard the sound first, other times they saw the image first.
Later, in the recognition phase, participants were shown the same pairings again—but this time presented simultaneously—and were asked to judge whether they'd seen or heard the combination before. Importantly, this part tested whether people could successfully recognize the pair regardless of sequence. But behind the scenes, the researchers were asking a deeper question: does the brain quietly remember the original order, even when it’s no longer relevant to the task?
To find out, they recorded participants' brain activity using EEG and applied a technique called multivariate pattern analysis (MVPA)—a machine learning approach that looks for hidden patterns in the brain data. Specifically, they wanted to see if they could detect neural traces of the original order (image-sound vs. sound-image) during successful memory recognition.
And they could.
The key takeaway? Even though participants weren't explicitly asked to recall the order, the brain appeared to reinstate it, replaying the original sequence during memory retrieval. This suggests that temporal context, including the order of sensory inputs, is automatically and deeply embedded in our memories, even when it's not consciously needed.
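For readers who want a concrete feel for what "decoding the order from EEG" involves, here is a minimal sketch of time-resolved MVPA in Python, using scikit-learn and MNE-Python's decoding utilities on synthetic data. To be clear, this is not the authors' pipeline; the data, labels, and parameters below are invented for illustration. The underlying idea is simply that a classifier is trained at each time point to distinguish image-first from sound-first trials, and reliably above-chance performance implies the order is carried in the neural signal.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from mne.decoding import SlidingEstimator, cross_val_multiscore

rng = np.random.default_rng(0)

# Synthetic stand-in for epoched EEG: trials x channels x time samples.
n_trials, n_channels, n_times = 160, 64, 100
X = rng.standard_normal((n_trials, n_channels, n_times))
y = rng.integers(0, 2, n_trials)  # 0 = image-first, 1 = sound-first

# One classifier per time sample, "sliding" across the epoch.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
decoder = SlidingEstimator(clf, scoring="roc_auc", n_jobs=1)

# Cross-validated decoding performance at every time sample.
scores = cross_val_multiscore(decoder, X, y, cv=5, n_jobs=1).mean(axis=0)

# On random data this hovers near 0.5 (chance). In real recognition-phase
# EEG, a sustained above-chance window is what supports the claim that
# the original presentation order is "reinstated" in the neural signal.
print(f"peak AUC: {scores.max():.2f} at sample {scores.argmax()}")
```

The appeal of this approach is that it asks the data a time-stamped question: not just "is the order in there somewhere?" but "when, during retrieval, does it become readable?"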
Insights from Our SSP 2024 Poster with L’Oréal
At the 2024 Society for Sensory Professionals meeting, we teamed up with L’Oréal to explore how consumers experience cosmetic products—not just through what they say, but also through what they implicitly associate. Our poster, “Decoding the Multisensory Dynamics in Cosmetics,” dug into how color, fragrance, and application together shape emotional and sensory perception of skin creams, balms, and moisturizers.
We ran a holistic, multi-stage evaluation (n=80 per product), where participants experienced each product through a sequence of:
1. Visual exposure
2. Fragrance exposure
3. Application
4. A holistic experience combining all three
Crucially, we added implicit association tests (IATs) at each stage to probe subconscious emotional and descriptive associations—revealing benefits beyond what people might explicitly report.
Research poster presented at Society for Sensory Professionals 2024, https://www.sensorysociety.org/meetings/archives/2024Conference/Pages/default.aspx
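For anyone curious how those implicit associations are quantified: IAT responses are typically summarized as a D score, the difference in mean reaction time between incongruent and congruent pairings divided by their pooled standard deviation. Below is a simplified, hypothetical sketch in Python; the response times and attribute labels are invented, and real scoring follows Greenwald and colleagues' improved algorithm, which adds per-block computation and error penalties.

```python
import numpy as np

def iat_d_score(rt_congruent, rt_incongruent, max_rt=10_000):
    """Simplified IAT D score: (mean incongruent - mean congruent) / pooled SD.

    A positive D means faster responses for congruent pairings,
    i.e. a stronger implicit association.
    """
    con = np.asarray(rt_congruent, dtype=float)
    inc = np.asarray(rt_incongruent, dtype=float)
    # Conventional trimming step: drop implausibly slow trials (> 10 s).
    con, inc = con[con < max_rt], inc[inc < max_rt]
    pooled_sd = np.concatenate([con, inc]).std(ddof=1)
    return (inc.mean() - con.mean()) / pooled_sd

# Hypothetical reaction times (ms) from one sensory stage of the protocol.
congruent = [620, 580, 650, 600, 710, 560]    # e.g., product + "soothing"
incongruent = [780, 820, 760, 900, 840, 790]  # e.g., product + "harsh"
print(f"D = {iat_d_score(congruent, incongruent):.2f}")
```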
What We Found
Fragrance Dominates: While all products were seen as hydrating and smoothing, fragrance emerged as the biggest driver of emotional impact, especially when it matched well with the product’s color.
Color + Fragrance Synergy: The most positively perceived product wasn't just liked; it also aligned visual, olfactory, and application cues in a way that drove strong, consistent associations.
Implicit Adds Depth: Products with congruent sensory cues generated richer implicit associations than an uncolored, unfragranced control, even when explicit liking was similar.
How This Connects to Multisensory Memory Research
Maack et al. (2025) showed that the order and integration of multisensory cues (like sight and sound) are encoded and later reinstated in memory, even when not consciously needed. In our cosmetic study, we essentially tested how layering sensory cues over time (visual, then fragrance, then touch) changes consumer perception and memory of a product.
If memory encoding is sensitive to multisensory sequence, as Maack and colleagues suggest, then product developers should think not just about what sensory cues are present but when and how they're experienced. The act of applying a product isn't just about function; it's a temporal narrative of sensations.
This suggests practical implications for:
Product Design: Consider the sequence of sensory interactions. The first impression (visual) sets the stage, but congruency with later cues (like scent and feel) drives trust, emotional resonance, and recall.
Consumer Testing: Implicit tools like IATs capture emotional shifts across sensory stages, something traditional surveys might miss.
Brand Storytelling: Create rituals that play to the brain’s ability to bind multisensory sequences into a cohesive, memorable experience.
🧪 A Closer Look: What Worked and What’s Worth Questioning
One of the most compelling things about the Maack et al. study is its innovative use of methods. By combining EEG (a technique that captures real-time brain activity) with MVPA (a machine learning approach that teases apart complex neural patterns), the researchers were able to peer into how the brain encodes and retrieves the order of sensory experiences, something traditional behavioral methods would struggle to detect. It’s a fresh take on an old question: how do we remember not just what happened, but how it unfolded?
Another strength is the use of naturalistic stimuli: real-world images and sounds that better reflect everyday experiences. That may sound small, but it matters in this type of research. A lot of neuroscience research uses abstract or highly controlled stimuli that don't map well onto real life and may fail to evoke realistic responses. Here, the findings feel a step closer to being usable, or at least applicable, whether in education, product design, or marketing.
That said, like any study, there are caveats. The sample size (32 participants) is pretty typical for this kind of research, but it still limits generalizability for broader consumer research. We can't assume these results would hold up across different ages, cultures, or cognitive profiles, so anyone designing consumer research on top of these findings would need to account for that.
Also, while the stimuli were more realistic than the abstract, tightly controlled tones and shapes common in the field, they were still relatively simple: an image and a sound, one after the other. So it does raise the question: what happens when the stimuli are more complex or emotionally rich? Does the brain still reinstate the original sensory sequence when recalling something like a song, a fragrance, or a memory tied to a personal moment?
These limitations don't undermine the study's value; rather, they open the door for future research to push further, deeper, and into more applied spaces.
💡 Why This Matters: Real-World Implications
The idea that the brain not only stores what we see and hear, but also how and in what order those inputs arrive, has some fascinating implications outside the lab.
In education and training, for example, this research suggests we should think beyond just mixing media. It's not enough to have both visuals and audio in a lesson; the sequence matters. Should you show the diagram first, then explain it? Or narrate first and follow with an image? Depending on the goal (concept clarity, emotional resonance, or long-term retention), this study suggests that how you order those inputs could shape how well learners remember them.
In the clinical world, especially in neurorehabilitation, the findings could inform new strategies for people with memory disorders or damage. If the brain naturally stores and replays the sequence of sensory inputs, then therapies that reintroduce this order, through guided, multisensory reactivation, might support more effective memory retrieval. It's a subtle but powerful shift from just triggering memories to reconstructing the sensory journey of those memories.
Even in tech and design, there’s something to take away. Human-computer interfaces (whether apps, AR experiences, or smart devices) often rely on multisensory feedback. But few are designed with a deep understanding of how the order of those cues shapes user engagement. If the brain is more likely to recognize and respond to familiar sensory sequences, then building interfaces that reflect those natural patterns could make them more intuitive, memorable, and emotionally engaging.
🔭 What’s Next? Questions Still Waiting for Answers
This study opens the door to a number of exciting research questions, ones that could take our understanding of multisensory memory from fascinating to truly transformative, especially in product R&D.
First, there’s the question of modality. This research focused on vision and sound, but what about the other senses, like touch or smell (as we did in our SSP study)? Would we still see this kind of sequence reinstatement if someone smelled a rose and then touched velvet, versus the other way around? These modalities are especially relevant in industries like beauty, food, and consumer tech, where tactile and olfactory experiences play a major role.
Then there’s emotion. This study used neutral, naturalistic stimuli. But life isn’t always neutral. What happens when the stimuli are emotionally charged? A childhood song followed by a parent’s voice, or the smell of sunscreen after the sight of a beach. Do those emotional layers make the sequence more “sticky” in memory? Or does emotion change the way the brain binds the sequence together?
And finally, there’s time. This study looked at short-term recognition. But what about long-term memory? Does the brain still preserve the sensory order of an experience after days, weeks, or months? And how does repeated exposure shape that memory? Does it strengthen the sequence, blur it, or replace it?
This study sheds light on the intricate ways our brains handle the order of multisensory information, emphasizing that not just the content but the sequence of sensory experiences plays a crucial role in memory recognition. By demonstrating that the brain reinstates the original modality sequence during retrieval, the authors open avenues for enhancing learning, rehabilitation, and user experience design through strategic sequencing of multisensory information.
🧠 From Research to Real-World
This study is a perfect example of why we need more bridges between neuroscience and the real world. It’s not just about understanding memory or perception in theory. It’s about asking, what does this mean for how we design, test, and deliver experiences to people?
That’s where I come in.
I help teams bring these insights to life, whether by leading targeted journal club sessions that unpack research like this for your internal teams, or by applying the findings directly to your work in product development, sensory testing, UX, branding, or consumer research.
Want to explore how multisensory sequence might impact your product experience or research design? Curious how to integrate tools like EEG, IATs, or behavioral testing without overpromising on what they can do? Let’s talk.
Because science should never sit on a shelf. It should shape smarter, more human-centered decisions.