In this episode Tristan Keelan, the Director of Business Development for CCNY, talks about how they evaluate data for nonprofits. Tristan shares how listening to clients and asking the right questions allows him to provide insights that help these organizations improve their communities.
(upbeat music) - [Announcer] This is
"Buffalo State Data Talk." The podcast where we introduce
you to how data is used and explore careers that involve data. - Hello, and welcome
back to another episode
of "Buffalo State Data Talk." I'm your host, Heather Campbell. And thank you for joining
us for Episode 14. Today, we'll be talking to Tristan Keelan, the Director of Business Development
for Community Connections
of New York or CCNY, a nonprofit management
services organization, and host of the "Data Doesn't
Equal Outcomes" podcast. Welcome to the show, Tristan. - Thanks for having me, Heather.
It's nice to be here. - Could you start us off by
telling us a bit about CCNY, and what you do for the company? - Sure. So CCNY, what we do is we help other nonprofits really assess the outcomes that their programs are producing. And as the Director of Business Development, I work with those nonprofits
to really match them
with the right solution. And I say right solution because there's more than one
way to evaluate a program. And very often what an organization wants or thinks they need is not
the same as what they do need.
So we, kind of, need to have
that back and forth to really, you know, get that,
kind of, that root need. And so I act as that liaison
between our data analysts, our program evaluators, and the clients to really
sort of get that mapping.
- Excellent. And so could you tell me a
little bit about, you know, a typical day or what a
typical week looks like for you in your position? - Sure.
So I spend a lot of time
talking with potential clients. And it's not, you know, somebody who needs a program evaluation, it's not like buying off the rack. You can't just walk in,
buy your program evaluation
and walk out the door, right? So most of my time is spent, you know, a lot of back and forth. Usually there's multiple
calls with a client to really, ask those critical questions,
and really make sure that what we agree to deliver for them is in
fact what they need delivered. - Yeah, it sounds like you spend a lot of your time asking questions, and more questions and
more questions. (laughs)
- Very, very much so, yeah. - So going back a little
bit to the actual data that your team uses to do this analysis, how are you storing this data? Are you storing it on the cloud,
or is there a specific program you use? - Yeah, so we typically,
in terms of storage, where does it go, we like to work on our
client's environment. So rather than, sort
of, taking their data,
we create ways for them to give us access to their environment. Whether it's their cloud or
whether it's their local server, because that way we can know
that when we're finished the work that we're leaving
them will in fact work
on their systems, and their
hardware and their software. If we do it, you know, on our end and then try to pass it over, any number of things can go wrong. Oh, that system's not compatible
with the hardware that we have. You know, we would have run into that, and encountered that and
corrected that had we known. So we prefer to work in our client's environment in that way.
- That makes a ton of sense. I mean, technology can be a huge barrier and often people don't necessarily want to learn a whole new program. So if you're already using
something that they have,
that makes a lot of sense. - I think what's underestimated lately is, not all the hardware out
there is created equal, especially when it comes to
data and large data sets. You know, there's programs
that might run really smoothly
on your i7 processor that
when you hand it over to your client on an i3 processor,
they can't even open it. And if you haven't like
negotiated those things, you can take a lot of really good work and find it unusable.
Or you force your client to now buy a new computer when maybe that could have been avoided. So we use SQL Server from Microsoft. You'd be surprised, or
maybe not surprised,
how much data just still
navigates through Excel, you know. But we can work in Tableau. We prefer to work in Microsoft Power BI as our data visualization tool. It seems to be the most
in sort of accessible
for our clients. But really Excel is still
the pivot point for data. Very often you're going from systems, to Excel, to Power BI. You know, if we're lucky we can use API integrations
to connect directly to
systems, but not always. - Yeah. I know that some data scientists think that we shouldn't use Excel, Excel is not a data science product,
and some people absolutely
love using Excel for data science. So it's interesting to hear that a lot of people are still
using that for their data. - I think that the visualization
tools like Power BI
and like Tableau have really reminded us that, at its core, Excel is a database. Before you color code anything... you know, for a long time,
it's been easy to use Excel
for its display functionality. You know, merge a couple cells so that you can put a header on it. Those kinds of things were very popular until now we're visualizing things
in, you know, actually these really, really fantastic visualization tools, but they require that if
you pull data in from Excel, that it's formatted like a
database and nothing else. So I would say Excel's
become even more important,
but it's playing a slightly different role because there's less pivot table happening because that's happening
over in the other systems. There's less display color
coding type functionality, 'cause that's happening
in the other systems.
But as a conduit, as a common mechanism to get from whatever
software you're using, there's almost always an
export to Excel feature, that's really the most common. I mean, we can pretend like it's not,
there's some huge, huge companies where you can directly connect, but most people aren't using huge-company systems. I mean, most people are using software that doesn't connect directly,
so you need to pass it to Excel. So I think it's almost
more important than ever. - One of the things that
you mentioned talking about, you know, having your Excel be a database, and not having it color-coded,
and merged cells and everything, I'm assuming that's a little bit part of cleaning up the data
that you need to do. So once you have access to the data, and you've done all of that cleaning
that you need to be able to analyze it, what happens next? How is the data actually
being used and analyzed? - When I talk about cleaning up the data, just an example,
if your Excel sheet
has a column of numbers and if they have a total, right, if they total up at the end, well, your visualization
tool is gonna add them all up and add your total.
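The total-row problem he's describing can be sketched in a few lines. This is a minimal illustration assuming a pandas workflow; the program names and numbers are made up:

```python
import pandas as pd

# A sheet as it often arrives: a "Total" row baked into the data.
df = pd.DataFrame({
    "program": ["Housing", "Food", "Counseling", "Total"],
    "clients": [120, 80, 50, 250],
})

# A visualization tool summing this column gets 500, not 250,
# because it adds the total row in with the data.
# Drop the summary row before loading:
clean = df[df["program"] != "Total"]
print(clean["clients"].sum())  # 250 -- the tool can now compute its own totals
```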
So you need to get that
total out of there. We work on something called Utilization-Focused Evaluation, and it's really easy, especially for data scientists,
to just get too big and too deep. And our model of utilization focus says, if your client can't use it at the end, then it shouldn't have been done. And so very often,
you wanna build something
that's really awesome, but, you know, and I don't mean to downplay the intelligence of our clients, 'cause that's not it at all, what it is is they don't live and breathe this stuff, right?
And they don't need data
and awesome dashboards. They need answers to questions, and they need to solve problems, right? The data scientists can get too hung up in, "Look at this, you can click here,
it filters over here, it does all this stuff. And you can answer this question." Well, they only need to
answer this question, so what we need to do is
give them that answer.
And so really sort of
taking that step back and making sure that you're
working with your client to deliver something
that can be understood, and can be used and almost has
a planned action behind it. Like that client should
know if the metric dips
below a certain amount,
we know what to do. We're ready to go and ready to intervene. So really building is
those final deliverables with that utility in minded. - So you perfectly led
me into my next question,
which was the deliverables. You already mentioned dashboards, but what is it that you typically
are giving to your clients in their dashboards, written
(indistinct) reports, papers, presentations?
- Sure. All of the above. But, you know, you mentioned dashboards. And I think that one of the parts of what I get to do that's really important is everybody
usually tells us what
they need is a dashboard. I need a dashboard that shows me this. And it's because that's the
way we're sort of being trained to think in visual analytics,
everything's a dashboard. And when it got down to it, you know,
they didn't actually need a dashboard. They had to submit raw
numbers to a reporting entity. So every month they
were required to submit. So a visual dashboard
actually doesn't help them. It only forces them
to find those digits
and manually enter them somewhere else. So, you know, we were
able to uncover that, you know, hey, lo and behold,
to solve this problem, you actually need an Excel output that has those numbers laid out, so that you can find them and upload them, instead of a dashboard that makes you hunt for the digits and enter them manually. - So could you talk about a
specific challenge that you, or your company or your
team had to overcome, and how did you solve it and, you know,
maybe what went wrong along the way and how did you fix those things? - Yeah, so in a previous life, I worked at a social services provider and I was working with a
preventative services program
and they had this goal
bestowed upon them, right? By the county to meet all new clients, refer them to the program within 24 hours. And it's really hard to do, you know, you can call people,
they don't have to answer, they don't have to call you back. You know, it's one of those metrics that the average person says, "Well, we'll report on it,
but there's nothing we can do
about it to make it better." And so what we were instituting
was an assessment tool for this program. It was already in place, but we weren't really collecting
the data very well, though.
It's called the North Carolina Family Assessment Scale, or the NCFAS as it's more commonly referred to. And it measures how well a family's doing. And we would measure it pre and post, but it was being written down on paper
and just kind of filed away. So I recommended, let's
do a pilot project. I said, "Let's take one
caseworker, just one, and let's collect their data. Let's get it manually entered into Excel.
Let's get the relationship built
to the rest of the dataset, and let's just see, you know. Give me one person for a
month collecting these things. And I wanna take a look
and see what we can do." And the results of that were,
you know, we measured the delta, right? So the mathematical calculation is, for each case, what is the pre-test score, what is the post-test score, and the difference in between is what we're gonna call our outcome measure.
And we've looked at that in aggregate for this one caseworker over
the course of our test period. And what we're able to
do is take that Delta and you can filter it by a
whole bunch of other criteria. So does that Delta get better
or worse when other variables are applied? And so the first one that we applied was: was the client seen within 24 hours? And it was amazing. Clients that were seen within 24 hours had almost double the outcome of those that were not seen within 24 hours. So all of a sudden we went
back to the caseworkers and said, "Okay, well, two things. First, we're gonna collect
this data for everybody because it tells us a lot.
And now we can, you know," my tenure didn't get far enough, "but we're gonna look at other ways to collect it besides paper." Right?
The second piece is look at the difference in your case outcomes
when you achieve this seemingly arbitrary metric upfront, and it's not arbitrary, it's about engagement and
clients who get engaged quickly.
It's almost like a hook. Like you brought them into the program, you get that buy-in right away, and all of a sudden the
attitude changes from, you know, "Oh, that's not
something we can impact,"
to "Well, we have to
find a way to impact that because look at how
much impact it's having on our final case results." - It's really great that
you guys were able to go in and show the reason behind
why they needed to connect with these clients within 24 hours. I mean, if somebody told me, "Oh, you need to do it," I'd be like, "Okay, sure, all right, I'll try." And it doesn't work, and you're like,
"Okay, well, I don't really care because, you know, whatever." But then somebody comes in and says, "This is important because
it's actually leading to benefits to your clients."
Then you have a reason to do it. You have, you know, I think
that's really, really powerful. And it's cool that you're able
to show that with the data. - I think what we saw
is the mindset shifted from, "Well, I called them
and I called them again,
and nobody called me back." Then you started hearing things like, "No, I call once, I
call again in two hours, I call again in five hours, and if not, I drive to the house," right?
Like you start hearing, you know, things like, "When I get a new case, it's becoming a priority." - So I wanted to talk to
you about something else
that you mentioned when
we talked previously, that was important to you. And you mentioned it very briefly earlier, and it was also a topic of one of your previous podcast episodes,
trauma-informed data. So could you tell us what
is trauma-informed data, and why is it important to you? - The principle comes from
a therapeutic methodology called trauma-informed care.
It's the assumption that
all of us have gone through some kind of trauma, some level of trauma. But if we all approach our
interactions with people that we come across with the assumption that we don't know what
that person's gone through,
and it could have been
something significant, then we tend to treat those people with that in mind. And it's very important in the therapeutic
relationship to understand that people's pasts are
contributing very much
to their present. In the data realm, I
discovered very quickly that data is traumatizing to people, and especially when it
measures their performance. And the way that I've come to
describe this is when we're
in school, a traditional K through 12, post-college type of an education, we're given grades on assignments. We're given grades on, you know, classes and we're
given final GPAs, right?
So our experience with scoring mechanisms, and this goes across
the board for everybody, is they're presented as final. When you're done with the math test, you get a score,
you don't take that particular test again. Now, when we get into the working world, the pivot on this concept is substantial. Instead of having new tests and new course material all the time,
the measurements tend to stay static. If you're in a mental health clinic, you measure no-show rate constantly, the measure doesn't change. It's number of missed appointments divided by total number of
appointments scheduled, right? And the measure won't change, but you keep measuring it week over week, month over month, year over year,
trying to either improve it
or keep it at a certain level. And we're simply not prepared for that type of scoring mechanism. And what happens when we do
that is it's very threatening to have your performance measured
because we're so conditioned to think that's it, that's the stamp on our performance. And really we need to
shift our focus from data as a finality to data as
a means to improvement, data as insight that we
can try to get better at.
I think that data scientists
have an immense amount of power, but also an immense amount of responsibility when
communicating numbers and measures with their stakeholders. You almost have to become
the therapist for them.
"Hey, I'm about to show you something. You're not gonna like it, and that's okay." I think that any industry can
benefit from data scientists who shake that mindset, and remind people
that we measure data
for quality improvement instead of measuring data for
getting people in trouble. - Do you have any suggestions of some resources or
anything that you could share if anybody was more
interested in this topic?
- CCNY is actually publishing an e-book that will be available in December, that maps all the principles
of trauma-informed care to how they can be
applied in a data setting. So we will, very shortly,
have a comprehensive e-book
on the topic, so. - Excellent. And we can link that in the
description of the episode too. - Sure, yep. - So if somebody was interested
in working in data,
do you have any suggestions of, you know, what kind
of training or education that they would need? - The number one
recommendation I would have is learn your tools with data that you know.
And here's what I mean by that. I'll use Power BI as an example: I didn't learn Power BI
using healthcare data because I didn't know healthcare data. I would use sales and marketing data.
And that's how I learned the tool. Because when I made a report, I knew my data and I knew it could pass or not pass my own smell test, right? Like, I would know, oh
no, that's not right.
I didn't get that formula right. And I could keep working on it. - I love that advice. I think that's excellent. Because as you were saying that,
that intuition of whether you
got some things wrong or not, you're not gonna have on data
that you don't understand. And it's really hard to
build that intuition working on something that you
don't know anything about. So I think that's excellent advice.
So switching things up just a little bit, are you able to set aside time
for professional development? And if so, like what kind
of activities do you do? - Believe it or not, I'm finding LinkedIn to be
the best place to learn.
You know, following the right people, following the right hashtags, and, you know, we're all on our phones, on social media anyway. I guess doing it in a place
where people are sharing advice
and tips and, you know, that's where I see all the Power BI product updates coming through. So I found that the LinkedIn community has been a really great place to learn.
- So you are the host of the CCNY "Data Doesn't Equal Outcomes" podcast. So could you tell us very briefly about what is on your podcast? - The genesis of that title is,
you know, for the social
services industry that we serve, there's still a lot of people
trying to get to their data or create data in the first place. Now, the common phrase that I've heard is "I
need data and outcomes."
And that's just stuck with
me for the longest time, because they're not the same, and they can't just be lumped together. Data is just numbers, it's just what happened,
it's just some, you know, some numerical definition of the past. An outcome is a betterment. It's something that, you know, even if it can be mathematically shown, means more than a data point. An outcome means somebody got better. An outcome means somebody's
not sick anymore. We need to make sure
that we recognize that
just because we have data doesn't
mean we're doing the best. It doesn't mean we're
producing the right outcomes. And so our podcast is really
just about all those things that have to happen in
order for outcomes to exist, besides the data.
And so we like to tell stories about, you know, what are the interventions, what did you do because of the data. - Well, I highly recommend
our listeners check out the "Data Doesn't Equal Outcomes" podcast.
I've listened to a couple of episodes, and they're really interesting. And your most recent episode
had a very exciting guest, one of the Buffalo State
Data Science graduates, Muhammad Hawk.
And we'll link the episode
in the description. - Absolutely. - So Tristan, thank you so
much for joining us today. - Thank you for having me. It was wonderful.
- And to all of our listeners, if you haven't already, check
out our previous podcasts. They're available wherever
you listen to podcasts. For more information
about starting your career as a data scientist,
go to dataanalytics.buffalostate.edu. And don't forget to subscribe so that you get a notification each time we release a new episode of "Buffalo State Data Talk."