
Humanizing AI and the Power of Shared Stories

  • Feb 12
  • 15 min read
A hand touches a glowing heart rate line in a busy market. Blurred people, stalls, and a moody, dimly lit atmosphere surround the scene.
Image Source: AI-generated via Canva

In this episode of Signal Shift, Raakhee sits down with Andy Sitison, a tech executive and the founder of Elemental Advising, to talk about humanizing AI. The conversation opens with a stark warning from cognitive neuroscientist Dr. Jared Cooney Horvath, who recently testified that digital dependency is making younger generations less cognitively capable than their predecessors. Andy shares how his work with Share More Stories uses AI not to replace humans, but to deep-dive into the "shared wisdom" of communities by analyzing emotive and attitudinal data.


They discuss the "Digital Veil"—the disorientation of a world filled with AI-generated "slop"—and the urgent need for civic structures to catch up with the massive growth. Andy leaves us with a powerful call to action: to combat the modern loneliness crisis, we must "get local" and re-establish the physical wisdom and human connections that technology cannot replicate.





The Cognitive Impact of Technology

The conversation opens with a reflection on the recent US Senate testimony by neuroscientist Dr. Jared Cooney Horvath.

  • Cognitive Decline: Dr. Horvath testified that Gen Z is the first generation to score lower in core cognitive measures (literacy, numeracy, and overall IQ) than their parents at the same age.

  • Learning from Humans: He argued that humans are biologically programmed to learn from other humans and deep study, whereas digital tools in classrooms fragment attention and train students to be "skimmers".

  • The Educational "Problem": This decline is noted in over 80 countries; once digital technology is widely adopted in schools, academic performance typically drops significantly.


Andy Sitison: Humanizing AI

Andy Sitison, founder of Elemental Advising and CTO of Share More Stories, shares how he uses AI to analyze community narratives to uncover emotional insights like joy, anxiety, and trust.

  • Trust and Authenticity: A key finding from his work is that trust and authenticity "travel together"; when people trust a process, they are more authentic in what they share.

  • Humanizing Data: Andy argues that AI is best used when it pulls forward the needs of a community to help leaders make better products or policies, rather than just chasing profit.
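To make the analysis concrete, here is a minimal sketch of the kind of check Andy describes later in the episode: scoring stories on emotive dimensions and measuring how those dimensions move together (covariance) across a collection. The dimensions and scores below are invented placeholders for illustration, not Share More Stories' actual model or data.

```python
def covariance(xs, ys):
    """Sample covariance: how two score series move together across stories."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    return sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / (n - 1)

# Invented per-story scores (0-1) on two emotive dimensions.
trust        = [0.9, 0.2, 0.7, 0.4, 0.8]
authenticity = [0.8, 0.3, 0.6, 0.3, 0.9]

print(round(covariance(trust, authenticity), 3))  # → 0.075
```

A positive covariance in this toy data mirrors the episode's finding: stories that score higher on trust also tend to score higher on authenticity, i.e., the two "travel together."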



The "Trough of Disillusionment"

The guests discuss the current state of AI adoption, which Gartner has identified as entering the "Trough of Disillusionment".

  • Maturity Gap: Despite rapid adoption, the current application of AI remains immature.

  • Market Share vs. Humanity: Many AI features appearing on devices today are designed to win market share rather than improve human life.

  • Civic Infrastructure: Massive growth in data centers (particularly in Virginia) is outpacing civic structures, with current laws often failing to distinguish between the resource needs of a standard home and a football-field-sized data center.


Reclaiming Human Agency

The episode concludes with a call to resist the "digital veil" that can make the world feel disorienting and fake.

  • Physical Wisdom: Andy notes that younger generations are gaining cognitive speed but losing "physical wisdom," such as the ability to work on machinery or solve physical problems.

  • Local Grounding: He recommends a "one hour a week" (or day) commitment to local, human-to-human activity—such as volunteering at a food bank, joining a bike group, or teaching someone to read—to re-establish grounding and connection.


*Disclaimer: The text in this post is AI-generated from an original video podcast - data sources, references and the episode transcript are provided below.



Selected Links:


Episode Transcript:

Raakhee: (00:00)

Hello and welcome to Signal Shift with me, Raakhee. A few weeks ago, Dr. Jared Cooney Horvath, an educator turned cognitive neuroscientist, testified before the US Senate. His opening words: our kids are less cognitively capable than we were at their age. Gen Z is less intellectually capable than previous generations at the same age.


Part of the reason for this, he explained, is the extent of social interaction required for the brain to learn. We learn from other humans more effectively, not screens. The dependency on digital and tech tools in schools is increasingly being questioned now. Just last week, I spoke about Denmark beginning to remove tech, like tablets, from schools and returning to textbooks due to similar concerns. Tech is a bit of a problem, certainly with education.


Now, we aren't going to be talking about education per se or learning today, but we are going to be talking about the human side of technology, seemingly second to profits and speedy proliferation. And to give us some insights on this, I would like to welcome our guest today, Andy Sitison.


Andy: (01:14)

I've been looking forward to this all week. I look forward to talking.


Raakhee: (01:17)

Awesome. Thanks so much, Andy.


So a little bit about Andy. Andy is a tech and startup executive with over 30 years of experience. He's currently based in Virginia and he's the founder of Elemental Advising, a company that focuses on data analysis and software development, using and leveraging machine learning and artificial intelligence to help businesses achieve their goals. This includes building high-end web apps that use advanced AI to both comprehend media and automatically generate new content.


Now Andy is particularly passionate about helping companies create better strategies to enable the human potential within digital transformation using AI. A quote on Andy's profile that speaks to, I think, the heart of our conversation today, Andy, is, "one of our contemporary challenges is balancing the addictive growth of digital technology which is outpacing human levels of learning and innovation in ethics, governance and intent". So, Andy, we have a lot to cover today.


Let's get straight into it. Tell us a little bit about your work and the projects you're doing.


Andy: (02:25)

Yeah, so, as you mentioned, I've been doing this a while, and I've always been in emerging technology. And what does that mean? I'm working in technology as it's arriving into a populace, and it means you have to not only understand the technology before others have had a chance, but also help people understand how to apply that, and what does it mean, and what are the best values, and what are the limitations and possible drawbacks of that technology?


Which has been a lot of fun in a career. And AI is probably one of the crown jewels of that effort, right? Like, it is a very interesting moment in technology evolution. And it's full of both sides, love and hate, on that. One of the primary things I do right now: I also work for a company called Share More Stories. I'm their CTO. And that's where I do most of my AI work, through Share More Stories.


What we do there is talk to communities, and we use AI to look for evidence of things that are emotive, that are attitudinal, like joy, anxiety. And so what we are able to do with these technologies is to go into a place and understand what the wants, needs, and feelings are of a group of people, whether that be someone buying products, or a group in a geography, or maybe minority travel in Virginia, which was one actual project we worked on in the past. So, you know, that's where this is coming in. So I'm right at this juxtaposition between humans and digital.


Raakhee: (03:54)

I mean, there's a little bit of the behavioral science sphere in there as well, in essence, in trying to understand human behavior, but you're using artificial intelligence to do that.


Andy: (04:04)

Yeah, it's interesting because I got the psychology degree and thought I wasn't going to use it. I loved the cognitive behavioral science part of psychology and I left the field because I didn't want to be a counselor.


I decided to get into tech and spent 30 years there. And then I took a sabbatical after flying all over the world for a couple of decades. Thought, you know what, digital is coming on strong, and I'm not sure if humans are winning, are gonna have an opportunity to stay above the wave, surf the wave, not be under the wave, right? And so I started looking at it. I started taking a more personal core belief in what does that mean, and how do I become part of that story? And in the search of that, I spent four months learning algorithmic programming, machine learning, AI, natural language, all that stuff that leads into this. But back to your question.


Humans matter. They always have. And it showed right back up and right in front of everything we were doing.


Raakhee: (05:06)

And what are you learning from these projects? I mean, what is it telling you about our feelings and emotions and behaviors that I think we wouldn't have known without the application of this amazing technology?


Andy: (05:20)

If you look at our projects, we have something we call shared wisdom. What we find is, well, first off, think about stories in the first place. And I'm not one of these guys that just loves stories. It just is a great format for analysis, but it's also a great format for humans, because when we tell stories...


We kind of go deeper in our brain stem. You know, if a group of people are sitting around listening to someone tell a story, especially a story about themselves, maybe it's a story about when they became an employee or, you know, the first time their grandfather taught them to fish or whatever it is, people quiet down, they listen, they give an empathetic response to the person telling the story. It's natural for us. And the person telling the story feels empowered by that. So when you work with stories, you get that as part of the package already. And that's amazing.


And so when we started diving into this, this shared wisdom: what happens when people start sharing a story in a certain context is they have this bonding experience. There's an enlightenment that happens. And so there's a cool function of that. But then we can go forward and start this. We pull out things like anxiety, joy, self-transcendence, activity level, all of these kinds of scores that no one else is scoring on.


And then we look for evidence. So if you take 500 stories and you're looking for evidence in those stories for these different things, not only the evidence of, you know, anger, fear, sadness, joy, but also the interdependence, the flow between them; that's called covariance, how data flows together through a data set, right? And how that happens between the different scoring, but also in the different kinds of people that are in that community.


One thing that's popped out as a true wisdom that we found is that trust and authenticity travel together. If you don't trust, you're not authentic. And that's a big message for anybody doing market research right now. When people are trusting of a process, they bring more authenticity to the table and they share that with us. Then you can apply that to all the survey work, all the metadata, everything else. If we're not trusting, we're less authentic.


Raakhee: (07:22)

I consider myself a tech optimist. Surely you must be too, working in the space, and being so much deeper in it as you are, right? Are you scared of it? Do you have any fears around it and our current trajectory, based on where we are with some of the infrastructure, governmental, legal, whatever it may be?


Andy: (07:43)

I think the quick answer is yes and no, and maybe, right? Gartner came up with the tech adoption cycle, and they had the section called the trough of disillusionment. I always loved that name.


It's true. We get through big buzz and then we crash, because we learn a little more. What's weird about AI is it was adopted quicker than anything I've ever seen adopted. So more people are involved in the process of the early stages of the adoption of this. And it was adopted very quickly, but at the same time it was one of the most fallible products, in certain ways, to be adopted that fast.


So we have this weird combination moment where we have a lot of people in the space, but we're pretty immature in how we use this technology. And when I say immature: there are gonna be so many evolutions of this in the next 10 years that it's gonna be mind-boggling. But just in the current application, we're still immature, in that most of the things people fear about AI are really capitalism, right? It's the deployment, the application of the profit-making against it, you know, whether we're making money off ad revenue, or we're investing too much and creating a bubble, and showing up on every stupid button on your phone now. Three years ago, I had maybe one or two things of AI; now there's a thousand.


It's not there to make your life better. It's not there to improve the humanity of the technology. It's there to potentially make a little profit, but more so to win the market share of the space. And that's a classic model. We do this a lot, but it feels more damaging this time, for a few reasons. And so one of the key things about this is: how you're applying AI matters.


I'm not blaming capitalism for it. Capitalism is just capitalism. It's a productive tool. So is AI. But we have to think about how we govern it.


I will leave you with this: it's also civic structures. Like, we're having major data center growth in Virginia right now, because the wires from Europe come over right where we are. And, you know, they're wanting to build data centers that are six, eight football fields wide. And you're seeing tons of money coming in from all over the world, trying to win little counties over.


The problem we have is most of our civic structures, you know, if you're a person that owns a 1,200-square-foot house and you're a 100,000-square-foot data center, our laws treat you like the same person when you go to access water, when you go to access electricity, all that type of stuff. We're a fair business state, right? So there's things that we weren't ready for, as humans and as a society, to deal with this massive growth. And I think all that stuff is hitting all at one time. Hopefully we work through all that. And I don't think the technology is inherently scary. It has so many good benefits.


Raakhee: (10:35)

You're absolutely right, right? It's about capitalism. And then it's about these structures, these civic structures, often. And, you know, all of that is kind of not in the right place and not in alignment at the moment; certainly a lot of work, you know, needs to be done there.


I think you have mentioned wanting to humanize technology, right? To bring the humanity kind of back to it, to humanize technology. What does that mean? What does that look like?


Andy: (11:02)

Well, you know, I think one thing we do, as I mentioned, is talking to communities and using AI to better analyze stories. And I will just quickly, to make it concrete: you know, if we get 500 stories from a community of people and we hand it to a human to read, well, I'll see you next week, right? Like, it's a long time to read through that context. And the ability to understand the thematic trends and to find the complexity, to find the needles in the haystack: humans aren't good at that.


The technology is great at that, right? So it allows us to use that technology to pull forward the needs and wants of a community so that you can make better products or you can make better policies in your government. And so that is humanizing. So AI can be applied in a way that does make the world more human. And I think that's a key piece of it. There's also technologies that...


You know, they're starting to come out that are considering the, you know, the evaluation of humanity with those technologies. We're hearing the term empathetic AI.


Can you make a robot more empathetic? Absolutely. But it's also more about how our companies, our organizations take these technologies to improve the lives of the people that are involved. Working to improve human lives and to create more trust is not only a good thing, it's good for all of us, but it's also something that's really desired in our communities and our populations right now.


Raakhee: (12:30)

I think, of course, we all want to have empathetic AI as part of the mix, right? And I see sort of two sides to the coin, one side being that it's incredible that now we have AI coaches that can help so many people, right? And it's so much more affordable, it's more accessible. And that's great. So I think there's some virtue there.


At the same time, we know the ill effects of some of these relationships, or maybe not quite ill effects, but just things we've never seen before. I mean, just a couple of weeks ago, a lady married her ChatGPT AI avatar. She literally had a ceremony and married it, and she was really serious about it. And we see more in that vein, right? People dating somebody who's just an avatar, or having these relationships.


So on the other hand, you're like, well, I guess AI is far too empathetic in this case. It's taking over maybe some of these human bonds or relationships. And this is in the mix of the loneliness crisis that we face as well. Yeah, I mean, it's a big question. There's no easy answers to that. But yeah, those are the two things that come up for me. So, any thoughts on that?


Andy: (13:40)

We need people. We need connection. And this group of young adults.


They're different, and in that difference, they're way beyond us, me as a Gen Xer and others, but they're also limited in things. One of the things I like to call physical wisdom: like, I grew up knowing how to change a tire, work on a tractor, work on carburetors, whatever. There's a physical wisdom that's gotten lost over the years.


You know, I think it is a moment where there is a lot of loneliness. And part of the failure of what we're doing is the fact that we're continuing to let that happen. And here's my thinking on it.


Two things are happening. One is it's easier to do everything we do. We have more leisure than we've ever had. And most people would be like, I'm busy. Yeah, you are busy, but you have more. We don't use a horse and a wagon to get anywhere anymore. We don't walk a mile in snow to go to school. And that's moved on even past the eighties and the nineties.


I worked 80-hour weeks for the first 20 years of my career. I don't work 80-hour weeks anymore. No one does. And so we have freed up, because we're more productive than we've ever been. Even though it doesn't always feel like it, we are very productive and businesses are very productive. That has left us with this void of time to fill, and a mind that is moving a lot faster than it used to. Like, I watched the change in video editing that kids are doing with videos, and now they're going in sub-second flashbacks, you know; I'm like, I can't even see the picture. It's like they're seeing things, they're cognitively processing things, at a different speed than I can. I've always been the guy that was too fast for everybody, you know, visually and everything else. So I think that's part of it. We have more time on our hands, and the other is human agency.


We kind of lost some of our societal thinking, and part of it's because we do so much. It's like, I'm sitting here in this room with four screens and a phone and two computers, and I don't need to leave. I don't go to work; I work here in my house, and I'm very productive at doing that, but it also limits a lot of my social connections and so forth. So we need to think about not only how we personally connect and do volunteer work and everything else so that we're part of something, but we need to think at a structural level, at our civic levels, our state levels, our federal levels, about what it means to be a society, and what is right, what is wrong, what ethics and morals matter.


We have more things that allow us to divide and to split and have different perspectives and opinions. We have fewer things synthesizing those things back together. So we have to start thinking: what is the new version of all that?


Raakhee: (16:31)

Absolutely. I mean, we could break all of this down and have another hour-long seminar.


Andy: (16:37)

This is a big topic.


Raakhee: (16:39)

It is, it is. Is there anything else you do want to touch on, or what's the big message you want to share?


Andy: (16:46)

Yeah, I think it's about what I call the digital veil. When we get out in the world, and now with AI doing generative stuff, there's a lot of it, and people call it slop. I think some of it is slop. And then you think about marketing, and you start to get this uncanny valley, like everything's starting to feel a little fake.


And so the whole world around us is sometimes disorienting; it's almost always disorienting. And because we live in a personal and a business world that kind of never stops anymore,


I recommend people get local, you know. Get out, use your feet, do something. You can work in a food bank. You can, you know, go on a bike group, or you can go kayaking like me and join some people out there, or work on a house. I don't care. Teach somebody to read.


Do something that gives back, because it's also giving back to you and gives you context. You know, we need grounding. Humans need that. So get out locally, get to know people.


Just find somebody and make that connection. Do it for one hour a week, and then do it one hour a day. Whatever works for you, but put it into your life.


Raakhee: (17:53)

I love that. That's a great message. I think one we try to share often, and one I think we all have to start living by a lot more frequently in our lives.


Andy: (18:02)

I love the moment. It's scary. It's challenging. This isn't the end. This is the beginning in a lot of ways. So it's really about how to look at history and reestablish the parts that really were great about history and figure out what the human aspect of that is. So I look forward to the conversations going forward. If anybody wants to collaborate or talk about anything, I'm pretty easy to find out there.


Raakhee: (18:26)

Thank you so much, Andy. It was great having you. To everybody listening, thank you for being here. Let us know your thoughts, what's coming up for you. And until next week, bye for now.



