I tend to be critical of what often passes for analysis (and, in many cases, even for reporting). My complaints tend to revolve around imagination and scope. I pretty much ignore technology (given that storage and compute power are, by and large, commodities) and methodology (because for the past five or so years I have been fortunate enough to work with some talented developers-cum-data scientists who have done an outstanding job of translating ideas into analytic workflows and interactive dashboards in methodologically sound ways).
In my opinion, asking a good analytic question requires five things:
A sense of your audience. This one is odd insofar as you are dealing with two audiences: an internal audience (i.e., your managers, colleagues, and other stakeholders), who are likely to influence the questions you ask by virtue of their prior experiences and their sense of the external audience; and the external audience that you want to engage and inform with your analyses (and being your own audience for some endeavors is completely legitimate). The big questions here are: what [topics] do they care about? And why [or under what conditions] might they care about the topic of your inquiry? You might not know the answer to either question; ironically, your intended audience might not either.
Imagination. If a question is easy to answer, it’s probably not all that interesting. For example, consider routine reports that are generated on a daily, weekly, monthly, quarterly, or annual basis: they contain insights that have historically been interesting and perhaps important to the organization. Increasingly, however, routine reports are something of a button-pushing exercise: one needs to make sure the data is current and that an established process works, but that might be about it. If you have access to more and different data or a flexible analytic platform, routine analyses might serve as a point of departure for more interesting ad hoc analyses and analytic thinking (e.g., why is that store outperforming other stores that are, for all intents and purposes, the same?). In many cases, good analytic questions are integrative: they link two or more trends (and the data that is . . . or could be . . . associated with those trends) together in ways that other people just aren’t thinking about. For me, imagination is fueled by looking at and thinking about different sources and types of information (ref. Isaiah Berlin’s “The Hedgehog and The Fox”).
Intellectual curiosity. If imagination is about framing an interesting analytic question, intellectual curiosity is a matter of how you respond to the unexpected: the question you set out to answer might not be the most interesting issue that you gain insight into. You might learn something about the issue, the data, or the tools. At the National Retail Federation’s Big Show this past January, I argued that Isaac Asimov’s quote on discovery (“The most exciting phrase to hear in science, the one that heralds new discoveries, is not ‘Eureka!’ but ‘That’s funny...’”) was every bit as germane to analysis as it was to science. Throughout the analytic process, you should be learning about the strengths and limitations of the data that you are using (and other data that you could use) and the tools you are using relative to the issue that you are trying to understand. You might find out that the question you set out to answer was really the catalyst for finding different, more interesting issues and questions to think about. Intellectual curiosity, in this case, often goes hand in hand with determination and perseverance.
Courage. Rarely is courage celebrated as an analytic tradition or virtue. It should be. Internal and external audiences both exert pressure—directly and indirectly—on a line of analytic inquiry based on their beliefs and assumptions. The degree to which they can exert influence over how an analyst conceives of, thinks about, or tries to answer a question varies greatly. The challenge for an experienced analyst is navigating those beliefs and assumptions while remaining true to their sense of what might be important and interesting. I often turn to Tacitus here: “…victory is claimed by all, failure to one alone.”
A willingness to learn as you go. Asking good analytic questions is inherently risky; the question and resultant analysis might fall apart for any number of reasons: the data you need to begin trying to answer a question might not be readily available (or in a usable form), the analytic methodologies that you think are needed to answer a question might be experimental, etc. These are all learning opportunities. The challenge, in my experience, is that this learning occurs largely at the level of the individual and, depending on what their analytic process looks like, maybe the people they worked with. The insights might be passed on through coaching and mentoring relationships . . . or they might not. Very rarely (again, in my experience) are time and effort put into documenting the process of trying to answer a good analytic question . . . or into offering up insights as to why what seemed like a good analytic question could not be answered in light of the available data, tools, or resources (especially expertise).
Beyond these five things, it becomes a matter of storytelling (and the review and editing processes associated with storytelling), which is its own discipline in terms of necessary skills as well as individual and corporate processes. For that, I recommend taking a look at the thinking of authors like Neil Gaiman or the work of people like Nancy Duarte (“Resonate”).