How to Develop the Five Soft Skills That Will Make You a Great Analyst

Analysts often assume that the best way to advance their careers or provide more value is to improve their technical chops. While it’s true that you need a baseline of technical skills to conduct analysis, most of us need further development of our soft skills rather than further honing of our hard skills.

Soft skills tend to be harder to learn. Unlike technical skills, which you are forced to improve whenever a new challenge exceeds your existing skillset, soft skills require conscious practice.

Analysts who work to improve their technical skills at the expense of developing soft skills misunderstand a key part of their contribution to an organization. And ironically, that means they risk losing out on opportunities for career advancement. Soft skills are what allow you to bring strategic value to your team, and make sure your work solves actual business problems. Without them, you risk relegation to the role of “order taker” for the rest of your organization.

This post is about developing your soft skills in the real world. To be a great analyst, you’ll need to practice these five things:

Translate effectively

Analysts and non-analysts typically use different language to refer to the same things. Questions tend to get asked in ways that either lead in the wrong direction or require unpacking. For instance, a seemingly simple question, like “How much money are we going to make this quarter?” needs to be translated in a few ways. First, what does “money” mean in this context? (It could be ARR, cash, etc.) Second, how can that measurement be accurately translated to a query?
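As a sketch of that translation step, suppose "money" turns out to mean ARR. The vague question then becomes a concrete query. The table and column names below are invented for illustration, assuming a hypothetical `subscriptions` table with one row per subscription and a monthly price:

```python
import sqlite3

# Hypothetical schema: one row per subscription, with a monthly price and status.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE subscriptions (
        account_id INTEGER,
        monthly_price REAL,
        status TEXT
    )
""")
conn.executemany(
    "INSERT INTO subscriptions VALUES (?, ?, ?)",
    [(1, 100.0, "active"), (2, 250.0, "active"), (3, 50.0, "churned")],
)

# "How much money are we going to make this quarter?" translated as:
# "What is our current ARR?" -- active monthly subscriptions, annualized.
arr = conn.execute("""
    SELECT SUM(monthly_price) * 12
    FROM subscriptions
    WHERE status = 'active'
""").fetchone()[0]

print(arr)  # 4200.0
```

Note that the translation embeds a judgment call (counting only active subscriptions, annualizing a monthly price) that the question-asker never stated; surfacing those assumptions is exactly the translator's job.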

Figuring out the underlying question, finding a proxy for that question in the form of a SQL query, and communicating that assessment back to the person who asked it is easier said than done. Some of the organizations best known for assessing company performance say that the analytics translation problem is so difficult and important that it demands full-time headcount. The Analytics Translator role would “bridge the technical expertise of data engineers and data scientists with the operational expertise of marketing, supply chain, manufacturing, risk, and other frontline managers.”

A full-time analytics translator is likely overkill for all but the largest companies, so at most organizations, the people who work with data will need to bring that skill to the table themselves. Without it, the wrong question will likely be answered, thanks to misinterpretation at the outset of an initiative. And as the analytics and data science work within an organization gets more sophisticated, communicating outcomes will only get more difficult.

Evaluate yourself

Think about your general approach to fielding questions:

– How often do you get the feeling that the people for whom you’re creating analysis aren’t fully describing their problems?

– What tactics do you use to draw more precise information out of them?

– Ask yourself: “If I answer this question exactly as pitched, what action will this person take as a result? Is it different from the action they would take anyway?”

Resources to help you improve

1. Linda Powell’s presentation at OSCon, “Talking Data with Non-Data People,” explains how to translate effectively, with tactics and examples.

2. In an article for Harvard Business Review, authors from McKinsey explain their view of the role of analytics translation and the skills needed to do it effectively.

3. In “The Translation Layer: The Role of Analytic Talent”, a white paper for SAS, Lori C. Bieda explores the role that analytics translation plays in bridging the gap between data and the conversations where business decisions are made.

Ask good follow-up questions

Asking an analyst a question is easy. Asking a really good question, whose answer can be immediately turned into action, is hard. It’s almost never the case that the answer to the first question in a conversation is the true source of a problem. So to get to the crux of an issue, analysts need to ask clarifying questions.

Consider a question like “How much traffic did this blog post get last week?” The person asking this question is most likely making some assumptions — for instance, that traffic is a good indicator of which blog posts perform well.

But for many businesses, this isn’t actually a good indicator of success. Without asking follow-up questions to try to get to the heart of the issue, analysts can’t solve the real problem. Or, solving the problem might take many iterations. The initial question gets answered, but then the questioner realizes the answer doesn’t actually solve their problem, and so they ask another question, and so on.
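To make the gap concrete, here is a minimal sketch, with an invented `post_stats` table, of how the question as asked and the underlying question can point at different answers:

```python
import sqlite3

# Hypothetical data: per-post traffic, and the signups each post drove.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE post_stats (post TEXT, views INTEGER, signups INTEGER)")
conn.executemany(
    "INSERT INTO post_stats VALUES (?, ?, ?)",
    [("launch-recap", 9000, 12), ("sql-tutorial", 3000, 90)],
)

# The question as asked: which post got the most traffic?
by_traffic = conn.execute(
    "SELECT post FROM post_stats ORDER BY views DESC LIMIT 1"
).fetchone()[0]

# The underlying question, surfaced by follow-ups: which post drives signups?
by_signups = conn.execute(
    "SELECT post FROM post_stats ORDER BY signups DESC LIMIT 1"
).fetchone()[0]

print(by_traffic, by_signups)  # launch-recap sql-tutorial
```

Both queries are trivial to write; knowing which one to run is the soft skill.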

Navigating this back and forth to make sure the right question gets answered takes patience and excellent listening skills. While it can sometimes feel like wasted time, it’s far better to spend that time up front, before any technical work is done, to ensure the resulting analysis has the desired impact.

Evaluate yourself

Think about the last piece of work you delivered:

– Did the initial analysis solve a problem for the person who asked a question?

– Did you get follow-up questions that you could have predicted or sussed out before actually starting your analysis?

– How many times did you have to return to your query or conduct further analysis to address additional questions?

Resources to help you improve

1. Modern Analyst’s content series Getting Back to Basics is a good primer on how to determine requirements before you begin analytical work. The first article in the series, Asking the Right Questions, is a great foundation on which to build your skills.

2. Mike Savory, who works as a product manager, authored a worthwhile post on the three questions he asks in product research, and why. Though the questions are positioned for product managers, the thought processes behind them are important considerations for analysts to make as well when exploring a problem.

3. The 5 whys is a great framework for iterative problem-solving.

Determine where information has been conveyed incorrectly or incompletely

By the time a question makes it to an analyst, it has likely gone through a veritable game of telephone. Analysts need to be able to identify whether the information they’ve heard is accurate in order to conduct an accurate analysis.

This has a lot to do with understanding the context of the problem being solved. If, for instance, an analyst doesn’t have much visibility into their organization’s marketing spending patterns and how they fluctuate seasonally, it’s next to impossible for them to provide useful insights about where and how to maximize efficiency. Say a marketing manager on the team wants to know how much money to allocate to various campaigns this quarter; it’s important that the discussion acknowledges how performance in Q4 tends to differ from other quarters, as holiday shopping tends to drive ad prices up.

As an analyst, you have to accept that you have blind spots, and it’s your job to minimize them as much as possible. Seek out opportunities to get hands-on experience with your problem, if you can.

Evaluate yourself

Think about the last time you received a request for a report or an analysis:

– Did you seek out the person who initially raised the question to learn their desired outcome?

– Did you know the day-to-day responsibilities of the teammates who would use your analysis?

– Did you take the time to familiarize yourself intimately with the work that your analysis was meant to impact?

Resources to help you improve

1. A classic, simple, but surprisingly effective method of determining whether you understand a problem thoroughly: can you explain it to a rubber duck?

2. Stephanie Famuyide recommends building a context map to define the scope of a project. In doing so, you can identify places where you may have incomplete or incorrect information about a problem.

3. The second installment of Modern Analyst’s Getting Back to Basics series, Requirement Techniques, includes a guide for getting the context you need for an analytics project at the source, by interviewing, shadowing, and facilitating.

Communicate your results

Even the most sophisticated and interesting analysis will be useless if its audience cannot understand it, or if it isn’t communicated in a way that prompts the audience to take action. This means one of the most important skills an analyst can have is the ability to communicate their conclusions so that they can make it back through the telephone chain intact.

Clearly and concisely communicating results is challenging, because results are rarely cut-and-dried. There are often caveats that make it difficult to provide a perfectly clear answer. It can be tempting to explain all the twists and caveats of an analysis when presenting results, especially if it was a particularly interesting problem to work on. But for the most part, the people who make decisions based on that analysis aren’t interested in caveats. Their priority is not understanding every facet of the analysis, but rather knowing enough to move forward with confidence in their decision. Analysis can be presented in a way that gives the audience the opportunity to decide for themselves which details they want to dig into, without creating distraction or confusion.

Evaluate yourself

Think about the last time you communicated the outcome of your analysis:

– What actions are your audience likely to take based on the details you’ve shared?

– Would those actions be in line with your recommendation? If not, is there additional context you could include to make the explanation more clear?

– Maybe more importantly, are there details you can leave out to make the explanation more focused?

Resources to help you improve

1. Chris Westfall has a useful and quick video on the importance of delivering context with data, “How to Communicate Data & Bring Numbers to Life,” which explains how an example can go a long way.

2. Megan Risdal has an excellent guide on Kaggle’s blog, “Communicating data science: A guide to presenting your work,” with four important values to consider: quantity, quality, relation and manner.

3. Jeffrey Keisler and Patrick Noonan, professors from UMass Boston and Emory University, published a deep dive tutorial on communicating data specifically for the sake of decision making: “Communicating analytic results: A tutorial for decision consultants.”

4. Benn Stancil, chief analyst here at Mode, has written about the importance of delivering analysis (which includes story-telling), not just charts.

Explain and document

Repeated work can be an enormous time-waster for analytics teams. Written explanation and documentation of analyses helps not only the audiences that analysts serve, but other analysts in the organization.

Analytics work can be accretive, but only if the effort is made to document it. Without proper explanation, documentation, and organization, a team’s current and future analysts will inevitably reinvent one wheel or another. With excellent documentation, analysts can build on their work and create more advanced and valuable answers for their organizations over time.
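One lightweight form this can take is wrapping a shared metric definition in a documented function rather than re-deriving the SQL in each new analysis. The function name, table, and definition below are hypothetical, purely for illustration:

```python
import sqlite3

def weekly_active_accounts(conn: sqlite3.Connection, week: str) -> int:
    """Count distinct accounts with at least one event in the given ISO week.

    Canonical definition of "weekly active" for this (hypothetical) events
    table. Reusing this function keeps every report consistent and spares
    the next analyst from rewriting the filter from scratch.
    """
    row = conn.execute(
        "SELECT COUNT(DISTINCT account_id) FROM events WHERE week = ?",
        (week,),
    ).fetchone()
    return row[0]

# Tiny fixture to demonstrate the documented query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (account_id INTEGER, week TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, "2023-W01"), (1, "2023-W01"), (2, "2023-W01"), (3, "2023-W02")],
)

print(weekly_active_accounts(conn, "2023-W01"))  # 2
```

The docstring is doing the real work here: it records what the metric means and why, so an unconventional choice never has to be reverse-engineered from the SQL.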

Evaluate yourself

Think about your work habits in general:

– Do you find yourself writing the same queries over and over to get started?

– If you were to leave your role today, would a new analyst be able to pick up where you left off?

– Would a non-analyst be able to look at reports you’ve built and know what they are for?

– Do you do anything in your analysis that’s unconventional or might raise eyebrows? If so, do you make it clear why?

Resources to help you improve

1. Airbnb’s Engineering & Data Science team has shared the processes behind their Knowledge Repo, which has helped them create a scalable and democratized data culture.

2. Roger Peng, a professor at Johns Hopkins University, published the “Reproducible Research Checklist,” which provides easy-to-follow Dos and Don’ts for making sure your work can be, as the title would suggest, reproduced.

3. Kaggle’s Winning Model Documentation Guidelines list the information required of their competition winners in order for their data models to be accepted. This list provides an excellent example of the level of detail to include when documenting your own work for colleagues.

4. One of the best formats for documenting your work dates back over a century: the lab notebook. These notebooks inspired the modern data science notebook, the use of which we have written about ourselves: “The (Data Science) Notebook: A Love Story.”



Derek Steer

Derek Steer is the CEO and co-founder of Mode. Prior to co-founding Mode in 2013, Derek was an early member of Yammer's Analytics team. There, he led sales and monetization analytics, drawing upon his experience on the monetization analytics team at Facebook and his background in antitrust economics. Derek's passion is training the next generation of analysts. He is the author of SQL School and a mentor at Insight Data Science. Outside the office, you can find him biking up Mt. Tam, checking out the latest exhibit at SFMOMA, or stuffing his face at the local taqueria.