5 habits to turn AI into a thinking partner

AI is the new baseline. But how can you move beyond it?

AI tools like Copilot are rapidly becoming embedded in everyday workflows. Most teams are already experimenting with GenAI, but access to these tools no longer automatically translates into better outcomes or a competitive edge – you need to master them.

In a recent webinar on ‘Beyond Copilot – Leveraging Agentic AI to Enhance Your Decision-Making Skills’, AI expert John Ennis (Aigora) and our very own Chief Research Officer, Greg Stucky, explored how professionals can upgrade from basic usage to more effective collaboration with AI. Their advice focused less on “magic prompts” and more on practical habits that unlock real value.

Here are five key takeaways.

1. Change your mindset when approaching AI

The core takeaway from the session is foundational to how we approach using AI: it’s about your mindset. AI shouldn’t just produce outputs on command; instead, it should help you expand your thinking and understanding – ask it to explore topics, uncover connections, or challenge your current assumptions.

Copilot, for example, can be particularly powerful for discovery and retrieval. It can quickly search large collections of research, documents, or past files to help users surface relevant insights they might otherwise not consider.

The goal is to learn more about a problem and explore new ideas – the mindset shift is to collaborate with AI as a thinking partner.

2. Ask exploration questions instead of direct commands

While a change in mindset is important, practical techniques can also enhance your AI usage. One simple tip from the webinar: start your requests with “How might I…”

For example:

How might I approach this research problem?

How might I structure this analysis?

How might I explore this topic further?

This framing opens the floor for multiple approaches and perspectives, helping users break out of their own assumptions and biases.

3. Practice “model empathy”

Another key concept introduced in the webinar was model empathy.

Large language models (LLMs) are extremely capable; however, they don’t know the context you’re working in, and they lack the common sense that seems straightforward to humans. To work effectively with them, you must think about the problem from the model’s perspective.

This means asking yourself questions like:

What information does the model have access to?

What context does it need to understand the task?

What details am I assuming that the model doesn’t know?

Providing that context helps the AI produce far more useful responses, tailored to your situation and needs.

4. Be clear about what you want and mean

Another important aspect of prompting that may be overlooked is clarifying definitions and providing structure. Just as with context and common sense, the model doesn’t have all the information you hold about the task at hand.

For example, terms such as “attribute”, “benefit” or “insight” may have different meanings depending on context. If the model doesn’t know how you’re defining them, the output may not align with what you need.

Giving the model definitions, templates, or frameworks not only helps guide its responses but also puts guardrails around its thinking, ultimately producing more consistent outputs that serve your needs.

5. Move beyond prompts – think in systems that serve your needs

Prompting is only the starting point…

To truly unlock AI’s potential, you should begin thinking about systems, workflows and specialised tools tailored to your specific needs, rather than relying on a single general-purpose model.

You’ll see the biggest gains when AI becomes more than just a tool for mundane tasks. Once integrated into your existing workflows, your processes for learning, decision-making, and problem-solving become more multi-dimensional and connected.

The real shift – what does this mean for insights professionals?

Ultimately, for insights and market research, the implications go beyond using AI tools for mundane tasks. These tools are most valuable when used as thinking partners.

When approached this way, the opportunity for insights professionals is not simply efficiency. It is the ability to augment their thinking: expanding the range of questions they ask, the connections they explore, and the stories they uncover in the data.

The challenge isn’t simply learning how to prompt better; it’s learning how AI thinks and being able to work alongside AI more intelligently. And in a field where the real value lies in interpreting complexity and revealing what others might miss, that shift could prove far more powerful than automation alone.