Drawn Together Through Visual Practice


by Brandy Agerbeck



  Rigorous Design of Visual Tools that Deepen Conversations and Spark New Insights

  Christine Martell

  When we get stuck in our ideas, intervening with visuals can help us reach beyond them and inspire us to new heights, especially if we remain curious about how others see the world.

  When a conversation starts with visuals, different kinds of insights emerge. Ask a question. Invite people to select images in response to the question. See the patterns and unique ideas emerge from the visual cards they choose. The responses to these pictures merge to form a new collective story.

  The power of images to create new insights is what led me to design the VisualsSpeak ImageSet in 2005. This curated, 200-image set has been used to facilitate conversations all over the world. In this article I share the rigorous multi-year process my team and I used to develop and test this tool. I include questions for you to answer if you are developing your own visual tools.

  Where it started

  In art school, I had the opportunity to do my work-study program in the library clipping collection: a set of images, taken primarily from books and magazines, that served as inspiration for artists working on projects. Over four years, I learned how to catalog images for other people to use while watching how students, faculty, and alumni searched for and interacted with pictures in relation to their ideas. Seeing how powerful this could be for inspiration, I amassed my own personal file cabinet of magazine clippings.

  Fast forward 20 years. I was getting a master's degree in adult education with a specialization in training and development, alongside colleagues interested in team-building, leadership development, and strategic visioning. My classes were filled with people who had experience in these processes, so school became a perfect incubator for designing tools.

  While I had been practicing creative methods in workshops and consulting for years, and getting great results, I didn't know why these tools worked so well. The research and writing skills I developed for my degree helped me articulate much of the theoretical basis for why visually based creative processes were so effective at reaching people.

  Design guidelines

  Graduate school inspired me to design a standardized visual tool that would be effective in the workplace. With hundreds of variables to consider, I needed some parameters to work within.

  Drawing on a reservoir of facilitation experience, along with conversations with potential customers and business professionals, I began to draft a blueprint for a successful visually based tool. Below are some of the need-to-haves we established for the product based on those experiences and conversations.


  Audience

  Professionals with some experience with groups in a work setting

  Broader audience than only certified facilitators

  A wide range of participants across multiple variables

  For both private and nonprofit sectors

  Audience from multiple cultures

  Across an organization, from individual employees through board of directors

  Beginners can make effective use of the tool without a lot of training

  Advanced users can master the tool with in-depth training

  Aesthetics

  Professional-looking enough for corporate settings

  Not so fancy and expensive that it is unaffordable for other sectors

  Use of tool

  Works consistently and quickly

  Deepens conversations

  Helps people reach across differences and communicate more effectively

  Creates shared understanding

  Sparks new insights

  Engages creativity without making people anxious

  Testing assumptions

  As I learned through testing the tools in various ways, it’s impossible to avoid inserting personal bias into the process. While some of the assumptions I made were valid, people often surprised me. Extensive testing with a range of participants taught me what works best for the largest number of people.

  To remove as many personal biases as possible and design the most well-rounded product, I ran experiments that tested specific assumptions. By observing people as they worked with the test images and asking questions, I was able to get past some personal blind spots. That wasn't enough, however. These tests had to be conducted by others to rule out any positive or negative influence I might exert as the creator. People tend to offer supportive answers to the person responsible for the idea.

  The process of testing was a continuous dance of asking whether the changes being made got me closer to the design guidelines. If not, it was necessary to come up with new ways to test, along with the corresponding design changes.

  Bringing together a team

  At a certain point I realized I had gone as far as I could by myself. I needed a more formalized team of people with varying degrees of group and individual facilitation experience, as well as people with different subject-matter expertise.

  To help me gain deeper insight, I added a business partner and a team of seven colleagues from a range of disciplines to help inform the development. The group included people with expertise in executive coaching, career coaching, intercultural communication, diversity and inclusion, and drug and alcohol counseling. They worked with me for a year to develop the tool and processes, as well as with their own clients and customers to give feedback.

  We developed a number of prototype VisualsSpeak ImageSets and watched how people used them for two years. The development team used them in sessions and reported back to us. As the process narrowed, we asked other professionals to try them in various settings.

  VisualsSpeak Development Team

  Testing, testing, and more testing

  By having a group focused on figuring out the best ways to create this new product, the testing took on a new momentum. We tested everything we could think of. We tested the images for how well they elicited information and insights. We also tested the processes we asked participants to go through. If some part failed, we tossed it and moved on to the next.

  Everyone I came in contact with became part of the testing. Friends, family, colleagues, classmates, groups. I sought out people who were born in or lived in other countries, as well as people from diverse groups in the US. Because I value inclusion, it was critical that the product be usable by a broad swath of the population.

  Testing image categorization

  The first product I wanted to produce was designed for use in groups and ended up consisting of 200 images. We determined this to be an ideal number for the average size of the groups people worked with, approximately eight to 12 participants.

  The problem became one of categorization. How could I create a set of images that would contain enough to be a visual communication system without being overwhelming to the participants?

  The reason I knew this was going to be a key issue is that when I began experimenting with the concept of facilitating with images, I started out by cutting out and laminating over 10,000 images from magazines and books. That’s a lot of pictures to wade through.

  When my development team was formed, we came up with a couple of test sets consisting of 4,000 images each, divided into 96 categories. We quickly realized that this was too unwieldy for both the facilitator and participants. The set weighed close to twenty pounds! It wasn’t going to be easily replicable, portable, or affordable.

  How categories are organized is not universal. If you ask a group of people to sort an identical set of items into piles of similar things, you probably won't get two sets of piles that are the same. Sorting is personal and influenced by the cultural lenses we bring to the task. To test a category system that would work for the largest number of people, we separated pictures into category-labeled boxes without marking the individual images. When people thought a picture was in the wrong box, we let them move it. If the name we had chosen for a label didn't make sense to someone, we asked what would work better.

  There was a lot of trial and error. We kept charts of information about the frequency of use for the various categories. We used the wisdom of the crowd to finally identify four major categories with three subcategories each for the structure of the first product.
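  For readers building something similar, here is a minimal sketch of what tallying that kind of sort data can look like, written in Python. The record format of (participant, image, category) tuples and every name and number in it are my illustration; the VisualsSpeak team's actual tracking lived in charts and labeled boxes.

```python
from collections import Counter

# Hypothetical sort records: (participant_id, image_id, category_label).
# Each record means "this participant filed this image in this box."
sorts = [
    ("p1", "img_042", "people"),
    ("p1", "img_107", "nature"),
    ("p2", "img_042", "people"),
    ("p2", "img_107", "objects"),
    ("p3", "img_042", "people"),
]

def category_frequencies(records):
    """Tally how often each category label was used across all sorts."""
    return Counter(cat for _, _, cat in records)

def consensus_category(records, image_id):
    """Return the most frequently chosen category for one image,
    plus the share of sorters who agreed on it."""
    votes = Counter(cat for _, img, cat in records if img == image_id)
    if not votes:
        return None, 0.0
    top, count = votes.most_common(1)[0]
    return top, count / sum(votes.values())

print(category_frequencies(sorts))
# Counter({'people': 3, 'nature': 1, 'objects': 1})
print(consensus_category(sorts, "img_107"))
# ('nature', 0.5)
```

  A split vote on an image, like the 0.5 share above, is the kind of signal that a category label, or the image itself, needs rethinking.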

  Refining the deck

  Throughout the two years of testing, I watched and listened carefully as people interacted with the images. As a team, we tracked the content of the pictures people selected through the category system, and I watched the visual language elements: shapes on the page, whether the images were literal or metaphoric, the textures and patterns.

  I started to have a sense of how well an image would evoke the kind of conversation I was looking for. We reduced the number of images as I found the types that worked.

  Potential users told us what challenges they needed to solve. If someone came within speaking, phone, or email distance, we asked questions and listened. Every detail of the tool was considered, reconsidered, and tested. We learned that no matter what we did, it wasn't going to be perfect, so we had to prioritize what was a must-have, as opposed to what was nice to have and what we might someday have.

  Once we understood what was working in our prototype deck, we started to go out and photograph the types of images we would need for the final product. We spent most of a year taking over 20,000 photographs to create the 200 we eventually used for the VisualsSpeak ImageSet. We supplemented the photos my business partner and I shot with photos friends took in locations around the globe that we couldn't reach, and a colleague joined us on a couple of shoots to bring a slightly different eye and add more diversity to the images.

  Refining the facilitation process

  Coming to understand the underlying visual language governing which images were most effective, and how to categorize those images, was only half of the challenge in creating a finished product. In order to write a user manual and train others in the best ways of using the ImageSet, I had to dive deep into the nuances of the facilitation process. What helped people get deeper insights, and what prevented them from doing so? If my team and I could refine the process enough, it could be used for a wide range of challenges.

  For example, two basic findings come to mind that helped facilitators get better results and understand their audiences. Counterintuitively, we found that giving participants less time, rather than more, to select their photographs produced better results. Over time, we also began to see patterns in how people laid out the photos they selected. Those with more structured thinking styles laid out their cards in structured arrangements, while those who were more big-picture oriented tended to be less constrained in their image arrangements.

  Through repeated testing and observation, we refined the facilitation process to the point that it became the backbone of any exercise facilitators would use the product for. It also became the underlying process for future products.

  What did I learn?

  The images we thought would work the best didn’t

  Images of metaphors and visual clichés did not deepen conversation the way we thought they would. Instead, these images became a shorthand that kept participants from having to dig deeper. Examples are the picture of a lightbulb or the half-full glass. When people see these, they assume they know what the image means, and the conversation stops because no one has to go deep into the image to draw new meaning from it.

  The spectacular photos from places like National Geographic didn’t evoke comment beyond praising the beauty of the photograph. Neither type of imagery reached our desired outcome of rich conversations and ideation.

  What we see is influenced by who we are

  It’s not possible to be fully neutral no matter how hard we try. Individual biases slipped into every aspect of designing the product from image selection to facilitation process. It took many eyes to even begin to counter the natural bias each of us carries. And even then, nothing is perfect.

  Size matters

  We knew from talking to people that having different image sizes was important. A large image says something different from a small one, and size was often used to convey relative importance. In order to meet the technical requirements of commercial printing, we had to make some hard choices about which images would be printed in which size format. The realities of the final cost of the product also played a role in how we structured the ImageSet.

  The process is important

  At the beginning of all this, I was totally focused on discovering which images yielded the best results. By the time we had a product, it was apparent that the facilitation process is just as vital to how well the tools work.

  A successful product launch

  After years of research and work, the first tool was successfully launched. Over 10 years later it is being used by thousands of people around the globe in areas like:

  Team-Building

  Leadership Development

  Strategic Visioning

  Coaching

  Intercultural Communication

  Therapy

  Education

  For me, it was like giving birth to a very special child. One that would help bring people closer together, deepen understanding, and be the spark for new insights long after I’m gone. It’s part of my legacy, my gift to the world. Even after all these years, I love talking to people who are getting great results with it. Those conversations continue to inspire me.

  Developing visual tools

  If you are interested in developing visual tools, here are some things to consider:

  Is the tool for you or for others?

  If you create a tool for yourself, you can rely on your particular skills and experience. If you want others to use your tool, the skills it requires need to be broader and less specialized.

  If the tool is for others, will it rely on specialized training? Do you want people to be able to use it quickly by reading a manual or learning through an online course? Will you only share the tool with people who have been through a certification program with you? Can they start getting simple outcomes right away and build mastery over time?

  Who are you creating the tool for?

  What age group? Do they live or work in a particular area? Do they work in certain industries? Will they use the tool personally or professionally? Will it be used in corporate, nonprofit, government, education, or small-business settings? Design considerations change for different audiences.

  What do you want the tool to do?

  What might it be used for? What might the user experience? What do you want participants to come away with after using your tool?

  What have you noticed that might be helpful?

  Is there something you have noticed working with visuals that shows promise? Do you have a resource that is being used for something else that could be repurposed?

  Do you want to create a printed or manufactured tool?

  Do you know how to produce a manufactured product? Do you know how to balance cost per unit with a realistic assessment of how many you can sell? Do you have graphic design skills? Do you know how to prep files for different kinds of printing? Do you own the correct rights to your images for manufacturing?

  If you don't have these skills, consider finding people who do or who can help you learn. It's much easier and cheaper to design around what is standard for the printer; if you design without these considerations, the cost can be prohibitive. Design for manufacturing is a balancing act between what is possible and what is affordable. All of this affects the final price you need to charge to be profitable. Can you sell at that price to your target market?
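  To make the cost-per-unit question concrete, here is a minimal break-even sketch, again in Python. All of the figures are hypothetical placeholders, not numbers from the ImageSet's actual production.

```python
import math

def break_even_units(fixed_costs, price, unit_cost):
    """Units you must sell before a print run turns profitable.
    Contribution per unit is the sale price minus the per-unit
    manufacturing cost."""
    contribution = price - unit_cost
    if contribution <= 0:
        raise ValueError("price must exceed unit cost to break even")
    # Round up: you can't sell a fraction of a set.
    return math.ceil(fixed_costs / contribution)

# Hypothetical numbers: $12,000 in design/setup/print-run fixed costs,
# a $95 sale price, and $28 per set to manufacture and ship.
print(break_even_units(12_000, 95, 28))  # 180 sets
```

  If the break-even quantity exceeds what you can realistically sell at that price, something in the design has to give: a standard print format, cheaper materials, or a higher price.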

 
