The crowded conference room, filled with dark suits, looked at me blankly. Somewhere along the way, I had lost my audience. It was not until I went back a slide that I was able to pinpoint the culprit.

“We have identified that analyzing the current data involves managing the bias-variance tradeoff.”

I rewound my explanation, stripping away the jargon. I broke it down: why we had to strike a balance between accuracy and consistency when working with that specific data.
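To give a feel for what I meant, here is a minimal sketch of the tradeoff using synthetic data (not the project's): a model that is too simple misses the signal entirely (high bias), while one that is too flexible chases the noise (high variance), and both do worse on data they haven't seen than a balanced model does.

```python
import numpy as np

# Synthetic example: noisy samples of a sine wave.
rng = np.random.default_rng(0)
x_train = np.linspace(0, 1, 20)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.3, x_train.size)

# Evaluate against the true, noise-free signal on a dense grid.
x_test = np.linspace(0, 1, 200)
y_test = np.sin(2 * np.pi * x_test)

# Degree 1 underfits (high bias); degree 15 overfits (high variance);
# a moderate degree strikes the balance.
for degree in (1, 4, 15):
    coefs = np.polyfit(x_train, y_train, degree)
    test_error = np.mean((np.polyval(coefs, x_test) - y_test) ** 2)
    print(f"degree {degree}: test MSE = {test_error:.3f}")
```

The exact numbers depend on the noise, but the pattern holds: error falls as the model gains enough capacity to capture the signal, then rises again once it starts memorizing noise.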

Closing the gap between data scientists and stakeholders is rarely straightforward. It requires a strategic plan and precise execution. The “last mile” is an issue that affects most technical professionals: doing great work isn’t enough if it can’t be communicated in a way that drives decisions.

When you research the problem, the answers are always the same: listen actively, be clear, show empathy, be receptive to feedback, and so on. Groundbreaking stuff.

But what do you do when the problem is none of that? What if the very person requesting the work has no idea what it involves and is unwilling to listen? Or worse, what if they think they know your job, but their “expertise” is far from reality?

Communicating in Difficult Situations

In one particular project I worked on, the demands were very high. The stakeholders claimed to be very familiar with the intricacies of the data. During the planning meeting, I was given the scope of work and told exactly where and what to look for.

The first week of the project, I was puzzled. The data, which had been claimed to be curated, was as raw as it came. The exploratory data analysis (EDA) took me an entire week. There was no naming convention, the compound keys were poorly used across the tables, and data integrity was nonexistent.
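To make the integrity problem concrete, here is a hypothetical, much-simplified version of the kind of check that EDA week involved: finding rows in a child table whose compound key has no match in the parent table. The table and column names here are invented for illustration.

```python
import pandas as pd

# Invented example tables. The compound key (region, account_id) in the
# child table should always match a row in the parent table.
parent = pd.DataFrame({
    "region": ["east", "west"],
    "account_id": [1, 2],
})
child = pd.DataFrame({
    "region": ["east", "west", "west"],
    "account_id": [1, 2, 9],   # (west, 9) has no parent row
    "amount": [100, 200, 300],
})

# A left merge with indicator=True flags orphaned rows in one pass.
check = child.merge(parent, on=["region", "account_id"],
                    how="left", indicator=True)
orphans = check[check["_merge"] == "left_only"]
print(len(orphans), "row(s) violating referential integrity")
```

Running checks like this across every table pairing is slow, unglamorous work, and it is exactly what the "entire week" of EDA consisted of.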

At our first touch-point meeting, I was scolded for taking an entire week to understand the data. This was the first time I had faced combative stakeholders. They wanted me to analyze the data, but their way. To make it worse, one or both of them were suffering from the Dunning-Kruger effect.

“If I do it the way you are suggesting,” I explained, “the results will be unreliable.”

They immediately brushed me off. The pressure to complete the project began to mount.

Calmly, I explained that I needed to be familiar with the data before applying any statistical methods. I demonstrated some of the issues. For instance, one column of numbers was stored as a string.

“Just make them numbers,” snapped the stakeholder who claimed to be familiar with the data.

Again, I explained that it was not that simple. Converting the data would produce a flood of null values, because certain characters (commas, periods, hyphens, and so on) wouldn’t be recognized. The problem was much bigger than they had estimated. Still, the pressure to move at a faster pace was on.
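For readers wondering why “just make them numbers” isn’t a one-liner, here is a sketch with made-up values of what happens when you coerce such a column: every entry the parser can’t read silently becomes a null, and those rows are lost unless you clean the text first.

```python
import pandas as pd

# Hypothetical column illustrating the issue: numbers stored as text,
# with formatting characters mixed in.
raw = pd.Series(["1,200", "350", "4.5", "-", "780", "N/A"])

# A naive cast (pd.to_numeric(raw)) raises a ValueError outright.
# Coercing instead turns every unparseable entry into a null:
coerced = pd.to_numeric(raw, errors="coerce")
print(coerced.isna().sum(), "values silently lost")

# Safer path: strip the known formatting first, then convert, and audit
# what still fails before deciding how to handle it.
cleaned = pd.to_numeric(raw.str.replace(",", "", regex=False),
                        errors="coerce")
print(cleaned.isna().sum(), "values still unresolved")
```

Even the cleaned version leaves genuinely ambiguous entries (like a lone hyphen) as nulls, which is a business decision, not a one-line fix.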

Understanding the Ununderstandable

Communication is a subject that I have been passionate about for many years. My graduate degree is in Strategic Communication. But no degree would have prepared me for some of the situations I’ve faced.

The stakeholders’ requests were outlandish. They wanted me to use the data as-is, without any EDA. Every algorithm I tried was overfitting or underfitting. The F1 scores shifted with each run. I even rechecked my SQL pulls to ensure they were not partial. For a moment, I began to doubt myself.
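When a metric moves between otherwise identical runs, the first diagnostic is to pin every source of randomness and re-evaluate. This is a generic sketch of that kind of check, using synthetic data and scikit-learn rather than the project’s actual setup: with the seed fixed, the F1 scores become reproducible, so any remaining drift points back at the data pipeline.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Synthetic stand-in for the project's data.
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
model = LogisticRegression(max_iter=1000)

# Fixing random_state on the splitter makes every run identical.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
scores = cross_val_score(model, X, y, cv=cv, scoring="f1")
print(f"F1 per fold: {np.round(scores, 3)}")
print(f"mean={scores.mean():.3f}, std={scores.std():.3f}")
```

If scores still shift after this, the randomness is coming from upstream, such as an unstable SQL pull returning different rows each time.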

By the second meeting, when the scope of work had completely changed, it became clear to me that the problem extended way beyond the data. They ended the meeting with, “Start with that and let’s see where it goes.”

No! That’s not how a task is supposed to be. It has to have clear guidelines and clear goals. I was being set up for failure.

So how do you complete a task when not even the taskmaster knows what they want? How do you understand the ununderstandable?

Becoming Proactive

This project was like nothing I had ever experienced. For the most part, I would listen to the request, take notes, and ask a few questions to ensure we were all on the same page. Within the established timeline, I would present a proof of concept outlining everything the data could and couldn’t do. Most projects were straightforward, with no major issues.

But this was no ordinary project. If I wanted a positive outcome, I’d have to be very proactive.

The data was full of surprises, and I began to turn those into collaborative explorations. When faced with unstable pulls or data-integrity problems, I would defer to the stakeholder who said he understood the data inside and out.

Often, his solution was less than perfect. He would suggest using a different table or promise to get back to me at a later time. That would be communicated in a team chat.

During the meetings, I would take notes and share them with everyone, providing clear guidelines of who was responsible for what. The agile board would be updated accordingly on a daily basis with comments on every decision we made. I even updated tickets that were not assigned to me.

Being proactive helped my teammates and me deliver a project that had appeared, at first, undeliverable.

Conclusion

Closing the gap in the last mile is not always as simple as active listening or asking the right questions. It requires a mix of strategies depending on the type of personalities you are working with.

The one thing I’ve learned that works every time is letting stakeholders be as involved as they want to be. If you are working with folks who are hands-off, let them be hands-off. Conversely, if someone wants to be involved, you might not have another option.