March 28, 2023 · 24 min read

Season 3, Ep. 17 – Using AI to find unbiased news, with Pat Condo, CEO of Seekr

What if there was a way to understand how biased your news source is? That’s the problem Seekr’s CEO, Pat Condo, set out to solve. This week, Faith sits down with Pat to talk about how to promote truth in reporting, his rich history of working with the defense and intelligence community, and that time he helped to launch a space shuttle.


Transcript


Faith (00:05):

Hi, Pat. How you doing?

Pat (00:06):

Good. Sorry about last time.

Faith (00:08):

Oh, that’s all good. Yosha and I were joking that, just because I work in technology, my family thinks that I can fix the WiFi, make their headphones connect, fix their printer, (Pat: Yeah.) and that’s just not the case. So I understand the technical difficulties.

Pat (00:24):

Yeah. You know what? I travel a lot, and I’m in various WiFi scenarios. But anyway, nice to meet you.

Faith (00:33):

Nice to meet you, too. I’m so excited to chat today. (Pat: Yeah. It’ll be fun.) I think this is gonna be a really awesome conversation.

Pat (00:38):

Yeah, it’ll be fun.

Faith (00:39):

I’m so excited to have Pat Condo on the Frontier podcast today. Pat is the founder and CEO of Seekr Technologies, a search engine that uses AI and ML to learn from and analyze news articles to spot incorrect, biased, or false information.

Seekr commercial actor 1 (00:57):

(UPBEAT DRUM AND BASS MUSIC PLAYS) With all human knowledge now in the palm of our hands, everyone thinks they’re an expert. And this is how my family reunions have been lately. (WINDOW CRASHING) My family has very distinct views and sometimes these family gatherings can get (PEOPLE TALKING LOUDLY OVER EACH OTHER)…out of hand.

Seekr commercial actor 2 (01:14):

(PEOPLE CONTINUING TO TALK LOUDLY OVER EACH OTHER) You cannot deny the experience of the flatness!

Seekr commercial actor 3 (01:18):

(PEOPLE CONTINUING TO TALK LOUDLY OVER EACH OTHER) It’s on the internet, so it must be true, right?

Seekr commercial actor 1 (01:20):

(PEOPLE CONTINUING TO TALK LOUDLY OVER EACH OTHER) The truth is, with so much information, how do you know it’s reliable? That’s why I use Seekr. Seekr is a transparent, independent search engine that cares about users’ privacy. And Seekr gives you balanced information with reliability you can trust. So when your cousin can’t stop arguing about that one theory, help them seek common ground. (COMMERCIAL AUDIO FADES OUT)

Faith (01:41):

And Pat, this is not your first company. Pat has founded four tech companies, two of which, today, are publicly traded, and he has provided advanced search capabilities to the defense and intelligence communities. That’s interesting. We actually, we should talk about that. That’s what I thought I was gonna be doing for a living. He’s been an executive at DEC, Harris Semiconductor, and Northrop, where he has had a hand in two of the largest technical programs in US history, which are the MX missile program and the space shuttle. That is so cool <laugh>.

Pat (02:12):

<Laugh>. I know. Everybody loves that. They don’t care about anything else.

Faith (02:16):

Do people often tell you that they’re starstruck when they’re talking to you?

Pat (02:20):

Well, that’s ironic, because <laugh> it’s kinda funny.

Faith (02:26):

<Laugh>. I’m a sucker for a space joke.

Pat (02:29):

Yeah. So no, that’s always the first thing they always ask is they could care less about everything. And they say, “Is that really the case?” And actually, it was. It was fascinating. I mean, it was one of those things where, you know, you just have, you’re just at the right place at the right time where I did something that was valuable to, you know, figuring out navigation systems. And at the time, they didn’t have the computing capability they had today, so figuring out how to guide a rocket 8,000 miles and “get between the goalposts”, they used to say, of a city or whatever, or take a rocket or a space shuttle to the moon. How did you do that without the computing power you had today? So there was a lot of, you know, mathematics.

Pat (03:16):

There was a lot of, you know, interesting experiments that were tried. But yeah, I did end up working on the space shuttle, and I saw its first launch, which was (Faith: Cool.) actually spectacular, because, they don’t do this anymore, but you used to be able to get within a mile of where the launch would occur. (Faith: Wow.) And what you would see is that, as the rocket took off, for the very first time, it’s strapped to the back of a Saturn missile, it’s like 600 feet tall, you know, it’d be like an earthquake came, then there would be like, a sound wave hit where you would just be blown back, and then when it, but the part that everybody laughs about is when I tell the story of when the huge fuel tanks fell into the ocean, it created sort of a tidal wave back onto the beach where there was millions of people, and it was all filled with dead fish, ‘cause they were boiled alive by the tanks.

Faith (04:13):

Oh my god <laugh>.

Pat (04:14):

<Laugh>. Anyway, so when people…

Faith (04:16):

That sounds just traumatizing on so many levels. I can’t imagine <laugh>.

Pat (04:20):

Yeah. But that was my experience <laugh>.

Faith (04:23):

<Laugh>. I would be scared to see whatever you had to sign to say that, you know, you wouldn’t sue them. That is, that’s crazy.

Pat (04:30):

Yeah. Today, I mean, oh my god, everybody would be up in arms over there, but it cracked windows for 60 miles.

Faith (04:37):

Wow. That’s wild.

Pat (04:37):

That’s how, powerful like, the blasts were. It’s all more refined now, but it was quite an experience, and yeah. I was like, 27 years old, so it was fun.

Faith (04:47):

Oh my gosh. I mean, I’m not too far from 27, and I think I’m just old enough to say that’s not fun for me. (Pat: <Laugh>.) But I also have a thing with space. It’s the joke here on the team that you can’t talk to Faith about space. (Pat: Really?) Which is weird, ‘cause I’ve got like, a UFO poster in my office.

Pat (05:04):

Yeah. Oh, there you go.

Faith (05:04):

It just makes me, I think, ‘cause I’m so selfish, that the idea that the universe is actually massive, and I don’t matter, makes me, it gives me the heebie-jeebies. So…

Pat (05:13):

So there’s one question they say people have always asked since the beginning of time: (Faith: What is it?) “Are we alone?”

Faith (05:21):

That’s true.

Pat (05:22):

What’s your answer?

Faith (05:24):

You know, I don’t think it’s possible that we’re alone. Right? There’s gotta be something. This took a turn, Pat. I wasn’t expecting to talk about aliens, but…

Pat (05:35):

No, no, no. We’re not talking about aliens, because I think it’s just an interesting question about, you know, the constant drive, (Faith: Mm-hmm &lt;affirmative&gt;.) the curiosity that people have that drives the science, that drives everything that you have and do today and where you’re gonna go. Without that curiosity, there’s nothing to drive you, and I think that’s a really important part of anything. If you’re gonna be in this kind of an environment where you’re pushing the envelope on technology in any dimension, it’s that curiosity factor, and I think that’s one of the things that humans have that I always look for.

Faith (06:18):

Speaking of curiosity, you’ve seen a lot in your career, obviously not just what we’re talking about, but (Pat: Yeah.) through that career, what are some of the things that you saw or worked on that inspired you to pursue what you’re now building at Seekr?

Pat (06:35):

Well, there was two things. So one was when I first started out building search platforms, I ended up in, somewhere around ‘97, I ended up purchasing a company in Columbia, Maryland that had about 30 scientists and technologists that worked for the NSA. (Faith: Hmm <affirmative>.) And at that point in time, I began to understand quite a bit about how intelligence around the world operated, and how all the different threats that existed that people in this country weren’t really aware of, and they go about their daily business not knowing, you know, what the rest of the world thinks about them. And so I built that company, and in four years, we sold it to Intel Corporation, and I thought that was it for me. I thought, wow, that was really great. We sold it for like, $1.2 billion, when that really mattered, (Faith: Mm-hmm <affirmative>.) and then–

Faith (07:39):

Billions still matter to me, Pat <laugh>. They still matter to me <laugh>.

Pat (07:42):

&lt;Laugh&gt;. But then what happened, of course, is we had the dot-com meltdown, and then 9/11. And 9/11 is what really affected me from that point forward, where I then started a second company, that became public, that was one of the largest technology companies, you know, in the defense and intelligence world, in search. And so if you think about Google as being kind of the front door, we were a Google that was in the back door, where we were just sucking everything in and analyzing everything, and allowing people to do investigations, and take action to, you know, to protect the country. (Faith: Mmm &lt;affirmative&gt;.) But what I learned in all of that was that there’s an offense and a defense, in terms of, you know, information warfare.

Pat (08:37):

And the offense that was being deployed, that we were deploying, was how do you identify terrorist organizations? How do you attract them? How do you destroy them? And there’s two ways: there’s the psychological way and there’s the physical way. And usually, the psychological way was far more effective, and what you would do is you would go in, and you would identify, disrupt, create chaos, cause confusion, have them turn on themselves, and eventually it would collapse. (Faith: Hmm &lt;affirmative&gt;.) So I learned a lot about how that all worked, and where it took me with Seekr is, if you fast forward to 2016, 2017, the same exact technologies are being deployed against Americans, (Faith: Right.) and those were things like, that came from Cambridge Analytica, which was a psych ops operation that had been used to break up different intelligence cells around the world, and it was being used on Americans.

Pat (09:33):

And I said, “You know, people ought to be aware that these techniques, these psychological techniques are being deployed, and people are not aware.” (Faith: Mmm <affirmative>.) And it really wasn’t until you know, I could tell people all the time, and everyone would say, “Oh, that’s a very small percentage,” until then that Netflix show The Social Dilemma came out, and, all of a sudden, people started to say, “Wait a second. Look at this; look at that.” And people from Facebook, and people from Google, and people from other, you know, Twitter, et cetera, began to talk about how they used what, I would call, your “cognitive bias” to drive you to pay attention, or to have a reaction, or to create an emotion, which would then, you would confirm over and over and over again, and then I could sell you whatever I wanted to sell you, (Faith: Mmm <affirmative>.) right?

Pat (10:24):

And I’d know more about you than anybody close to you. At some point, I could predict what you were gonna do, (Faith: Right.) and that’s how those technologies were. They’re fine in a rational world, but what people don’t realize is that outside state actors, nation states, adopt the same kind of technologies, far more advanced. How do they use those? And what we see over the last five, six years is that we have, what I would call, a “payload” of news and information that’s supercharged with an emotion, and that emotion’s been anger, (Faith: Mm-hmm <affirmative>.) and that’s been used to divide the country. And the more and more that that anger emotion can be charged in a headline, in a story, in a broadcast on social media, the more it accomplishes the objectives of countries that don’t, you know, wish that our democracy exists, and it’s growing at an ever increasing pace.

Pat (11:24):

And my fear is that these companies are so big, that have business models depending upon this, that, you know, that it gets watered down, and that’s been the case for years. But if you look at, in the last year, any poll that you do shows more than 50% of the people do not believe the news. (Faith: Mm-hmm <affirmative>.) Fifty percent of the people don’t trust their health organizations, and 50% of the people don’t trust the government. Now you’re getting into a place where, if you step back a little bit, those are the moments in times where revolts, revolutions, dictatorships, different kinds of things occur. And I’m not saying at all that that could happen, but we’re increasingly going towards more chaos than less. (Faith: Right.) More disorder than order, more where there’s no toleration, and where there’s a separation, and that’s not good. (Faith: Right.)

Pat (12:21):

It doesn’t take long where it progresses at a rate that becomes uncontrollable. And part of what I think is really interesting is that we’ve now taken, what I would call, I don’t know, a “cracked foundation”, and instead of trying to fix it, we’re building a skyscraper on it called ChatGPT, where you can create an unlimited amount of content that you don’t know the source, (Faith: Right.) you don’t know the depth, the breadth, you don’t know. And now there are derivatives that are coming from countries that are building these on platforms that can, you know, just completely flood an already polluted infrastructure. And, you know, people like us, we have critical thinking skills. But how about young people? That’s my concern.

Faith (13:13):

Yeah, and it’s a well-founded one, right? I mean, the question is, people feel more comfortable when they’re basically in an echo chamber, right? (Pat: Yep.) And, but we know that, in order to maintain our social fabric, it’s exactly what you just described. People need to be exposed to unbiased information. They need to be exposed to opinions that differ from their own, and that’s really uncomfortable, (Pat: Yeah.) and we don’t like it. Like, not only are corporations incentivized to create these echo chambers, but humans are incentivized (Pat: Yeah.) to stay in them. And so my question for you is, what can be done to promote the use of tools like Seekr to combat that kind of behavior?

Pat (14:01):

Well, so there’s two aspects to it that, I think, one is that, in this age that we live in, somebody like me is not popular. Not, you know, a hundred million views on Instagram. And so–

Faith (14:18):

Not yet, Pat.

Pat (14:19):

Not yet, but <laugh>…

Faith (14:20):

<Laugh>. We’ll get Yosh on it <laugh>.

Pat (14:22):

But you have to find those kind of people, because that’s how you communicate with the huge audience that exists. They, for the last 10 years, they have been the people that bring the trends, the people that bring the opinions, the people that kind of form young people’s, I dunno, points of view. (Faith: Hmm <affirmative>.) They’ve gone away from traditional, you know, existing paths, and they’ve gone more towards that, and they find themselves following those people, and that’s why they’re called followers, right? (Faith: Mm-hmm <affirmative>.) And so the, so what we thought was, how do we find people that have a big voice but are not political? ‘Cause that’s not what we want to be. It’s easy to find political people, but that’s hard to find somebody that’s looking just to be…educate the masses on media literacy or trying to be as unbiased as possible, and so we picked two people to start.

Pat (15:17):

One is we picked Tony Robbins, who, more about a massive audience of business and marketing, but has no political overtone to it, and he’s got literally tens of millions of followers. (Faith: Right.) On the other side, we said, who can we get that young people would follow? But again, not political. And we’ve signed up with an adventurer on Discovery Channel who’s very popular, called Bear Grylls. (Faith: Cool.) And so Bear Grylls has about 70 million kids around the world that follow everything he does, and so what he does is, he’s always built physical toughness, and, you know, overcome a challenge in the wild, and survival skills, now he’s focused on the mental survival skills, mental toughness, and how to, if you feel something’s not right, examine it yourself. Take a look around, (Faith: Mm-hmm <affirmative>.) look at a few sources, don’t fall into the echo chamber, don’t be bullied, because that’s what’s happening today.

Pat (16:21):

It’s like if you don’t fit, you don’t fit. And (Faith: Right.) it’s even more so now than ever before. And social media can be the biggest, you know, it can be a huge benefit, but it can be the biggest harassment tool on the planet, too. (Faith: Mm-hmm <affirmative>.) So we picked these two ends of the spectrum. One to get corporate CEOs and the corporate audience, and the other to get young people, and they’re gonna bring out Seekr. They’re gonna start talking about it. They’re using it everywhere, and we’re more and more gonna roll out programs, because that allows us to get up above the fray a little bit and start to get noticed. And so that’s one of the things that we’re trying to do, and we’re trying to find other people like that, and I think we’ll find them, because once they start to see what we do and what we’re trying to accomplish, I think there’ll be quite a bit of interest in it.

Pat (17:10):

And we’ve got this election coming up, and I don’t know if you remember, before the last election, there’s a whole host of people that think, oh, Silicon Valley is pushing the vote one way or the other, (Faith: Mmm <affirmative>.) so they went out, and they looked for alternatives. They couldn’t find many. (Faith: Mm-hmm <affirmative>.) And so they found DuckDuckGo, which went from kind of obscurity, all of a sudden everybody focused on it. (Faith: Billboards, yeah.) Yeah. All that, right? They went to Telegram, which was like, you know, most people had WhatsApp; all of a sudden, Telegram blew up, and (Faith: Mm-hmm <affirmative>.) you know, so these kind of, and then Twitter went the other way, so all these different things have occurred because people are looking for…yeah, the extremes are looking to get, you know, the 20% of the people want to be 80% of the voice, (Faith: Mmm <affirmative>.) but there’s more people in the middle, more people looking for choice.

Pat (18:07):

More people would rather say, I want to know what’s credible, not what is politically charged or what…yes, I could be even in an echo chamber, myself, but I like to look at a lot of things. I don’t just believe one thing anymore, (Faith: Right.) and even, you know, I have these discussions all the time with my brothers and sisters, or my wife, or whatever. They’re all like, “Well, how can you take a look at what CNN is saying?” or “How can you look at what Fox is saying?” I said, “Because there are some semblances here that each side is right about something.” Now, I know what the deal is. They only show what they want to their audience, (Faith: Mmm <affirmative>.) but if you thread a couple of these together, sometimes you get the big picture, and it’s a very different picture.

Faith (18:46):

Right. Yeah, and it’s, you know, it’s such an incredible tool to just develop empathy (Pat: Yeah.) for folks. And in a country that’s become so stratified, where we assume that if someone doesn’t agree with us, it’s because, well, they must be crazy <laugh>. Right? So being…I appreciate using Seekr, because I can see, okay, here’s the story that I read. What’s the story that somebody else read? (Pat: Yeah, yeah.) And how are we both approaching this true event from two very different understandings of what actually happened?

Pat (19:19):

It’s amazing, isn’t it? (Faith: Yeah.) I mean, I’ve picked some of the biggest issues over the last couple of years, and I almost think I’m in a different country. Even the, you know, we did quote extraction, (Faith: Mmm <affirmative>.) because one of the things that is very popular in disinformation is to distort a quote, (Faith: Mm-hmm <affirmative>.) and quote extraction is really hard to do, but we were the first persons to do it, because we felt it was important, because a quote isn’t always one sentence, it’s not one punchline, it’s three sentences, it’s four sentences. The quote is very big, and most of the time, they don’t get covered, and by taking the first sentence and the third sentence, I can make the whole thing different.

Faith (19:58):

Yeah. I’ll have to, when we’re done recording, Pat, I’ll tell you about the time that I was misquoted on Lester Holt.

Pat (20:04):

Really? (Faith: Yeah <laugh>.) Wow. And sometimes it’s on purpose, by the way. It’s not really, it’s for the audience.

Faith (20:12):

Exactly. Yeah, and it’s about making the story buzzier. (Pat: Yeah, that’s it.) So, I mean, we’re recording this at a really relevant time, because no matter where you look, there’s AI and there’s ML in the news, there’s been a lot, you mentioned ChatGPT, over the last three months, a lot has become accessible to the general public, and with that is a more widespread understanding of like, you know, this stuff is really cool, but we have to be very careful about how we’re training AI and ML to absorb human biases, right? (Pat: Right.) And you mentioned, I hadn’t thought about, you know, the intentional use of these tools to drive public sentiment in a biased way, so I’m curious if you see a future where AI and ML tools, which, of course, Seekr is based on AI and ML technologies, where they’re regulated to reduce bias. Or do you think that that will always be up to the individuals building the technology to address that?

Pat (21:16):

You know, it’s a question that has constantly been in the industry. If you go all the way back to the motion picture industry, (Faith: Mmm <affirmative>.) where there was a time when, what is the definition of an R movie, versus an X movie, versus PG, PG 13, and nobody could figure it out. And so people were upset about, “Oh, I don’t wanna see nudity. Oh, I don’t care. It’s not nudity,” on and on. And the government threatened to regulate it, so the movie industry was smart enough to say, “You know what? We better do this before the government regulates, (Faith: Mmm <affirmative>.) because once the government regulates it, our freedoms are gone, and the creativity starts to go away.” And pretty soon you know, that becomes censorship. So I think where we are is a very similar point. Music’s the same way.

Pat (22:09):

So all the creative discourse, you know, you recall people banning books, and on, and on, and on, right? All these things are happening, and I think where we are today with ChatGPT is if we, the industry in AI and machine learning, we don’t regulate ourselves, it will get regulated, and once it gets regulated, it changes everything. And I think that there’s a point in time where, okay, there’s sort of new technology bursts that occur. They’re evolutionary, they slowly come out, then they all of a sudden accelerate. And before they hit steady state, there’s this, the issue about what problem do they solve? What kind of issues do they create? And that’s the point to catch it, because once it starts to exaggerate and starts to get out of control, once you bring the government in, and we can see, now, that on both sides of the political spectrum, everybody wants to regulate something. (Faith: Mmm <affirmative>.)

Pat (23:13):

And it’s like, if the industry doesn’t adopt something, then it’s gonna squeeze the creativity out of it. So I think that, you know, phase one is always this burst of “here’s all the new stuff”, and then phase two is, “oh my god, how do we deal with it?” And I think we’re rapidly approaching that phase, and I think that’s where Seekr can come in, where we don’t own one of those networks, so we’re independent. We are a commercial company, so everybody knows what we’re doing. We’re not a government agency, we’re not a nonprofit, there’s some that really isn’t, you know, nonprofit, and we’re not funded by any political group or any of those kind of things, so I think the timing for all this is really good. I think if it goes too much further, and we start to see the election approach, my prediction is all sorts of nation states will start to really confuse everything, and then the government’s gonna go, “You know what? We have a huge problem here. You know, this is like, a national security issue,” (Faith: Right.) and we’re gonna, we’re gonna do all these things, then it’s gonna change everything. So I think it’s gonna, I think it, we’ll see something, you know, happen.

Faith (24:26):

Right. The other piece of regulation that isn’t maybe government mandated, but, you know, Seekr adheres to journalistic standards, (Pat: Yeah.) and I think anytime you’re building in that space, you’re gonna have some challenges with, you know, how do you address that? And so I’m curious, what has that experience been like, building Seekr and trying to do so in a way that adheres to those standards?

Pat (24:49):

There’s kind of two pieces to this. One is that we looked at standards in journalism, and there was some, you know, there’s dozens of them that are just extremely sensible, (Faith: Mmm <affirmative>.) and we said, “You know what? Let’s build a technical model of those.” So things like, does the headline match the body? Is there a byline? Those kind of things were, you know, the technical structure. Is the spelling right? All those kind of things, right? Because all those sort of lead to credible, not credible. But then, the part that we wove in there that’s new, is there are about 350 known cognitive biases that people have. (Faith: Yeah. Wow.) How do you take those and create them in a software program that then combines with these journalistic principles? And when they see a particular article, they all fire off and they say, “I see the presence of these things.” (Faith: Mmm <affirmative>.)

Pat (25:51):

And so the presence of something like, okay, the first step, does the headline match the body? Half the time, it doesn’t. But then you see things like, hey, are there ad hominem attacks? How many unknown sources are cited? Is there confirmation bias built in? Is there some other bias, gender, or religious, or age? How do you pick all that up? Are there certain words that are dog whistles (Faith: Mmm <affirmative>.) for certain groups? And before you know it, you start to see the presence of these things, and anyone, any journalist would say, “Oh my god. Well, a presence of all those things would indicate really poor journalism, or, you know, something that is just not credible.” Right? Because it’s the person who’s trying to influence you in ways that don’t conform to the presentation, itself. They’re just false. It’s a false narrative. (Faith: Mm-hmm <affirmative>.)
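The checks Pat describes, structural journalistic standards plus bias signals firing on a single article, can be sketched as a toy scoring pipeline. To be clear, this is purely illustrative: the check names, weights, and word list below are invented for the example, and Seekr's actual model is far more sophisticated (and covers hundreds of cognitive biases, not the handful shown here).

```python
import re

# Hypothetical signal checks, loosely modeled on the standards Pat lists.
def headline_matches_body(article):
    """Crude overlap test: do the headline's content words appear in the body?"""
    words = {w for w in re.findall(r"[a-z']+", article["headline"].lower()) if len(w) > 3}
    if not words:
        return True
    body = article["body"].lower()
    hits = sum(1 for w in words if w in body)
    return hits / len(words) >= 0.5

def has_byline(article):
    """Is the piece attributed to a named author?"""
    return bool(article.get("byline"))

def unnamed_source_count(article):
    """Count hedged attributions like 'sources say' or 'officials said'."""
    pattern = r"\b(?:sources?|officials?|insiders?)\s+(?:say|said|claim)"
    return len(re.findall(pattern, article["body"].lower()))

# Toy lexicon standing in for loaded-language / dog-whistle detection.
LOADED_WORDS = {"outrageous", "disgraceful", "shocking", "destroy"}

def loaded_language_count(article):
    body_words = re.findall(r"[a-z']+", article["body"].lower())
    return sum(1 for w in body_words if w in LOADED_WORDS)

def credibility_score(article):
    """Combine binary standards checks and bias penalties into a 0-100 score."""
    score = 100
    if not headline_matches_body(article):
        score -= 30   # headline doesn't match the body
    if not has_byline(article):
        score -= 15   # no named author
    score -= 10 * unnamed_source_count(article)
    score -= 5 * loaded_language_count(article)
    return max(score, 0)
```

The point of the sketch is the architecture, not the weights: each standard becomes an independent detector, every detector that "fires" lowers the score, and the combined presence of many signals is what flags an article as not credible.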

Pat (26:42):

And so we’re trying to identify, we go from two steps of proto narrative, which is, here’s an article, (CLAP) here’s what it looks like, to now a moving narrative, which is far more complicated, which is you and I are having a dialogue, and occasionally we veer off path. And how do I know that I’m not taking you down a path to now convince you of something that you hadn’t really thought of before? And now I’m swaying you in a particular dimension, and you are sitting there saying, “You know what? I never thought of that. I think I’m gonna go there,” and then your next caller you get on with, and you convince them (Faith: <Laugh>. Right.) to go down that path. Now, how do I track that narrative? Because that’s one of the more destructive things that nation states can do, (Faith: Mmm <affirmative>.) and that’s where things are headed. So we look at static, but then we look at the moving narrative, and we look at it spatially, we look at frequency, we look at all that.

Faith (27:38):

Pat, I gotta tell you, a million years ago, I studied international relations and Middle Eastern studies for my undergrad, (Pat: Mm-hmm &lt;affirmative&gt;.) and I feel like I’m getting a refresh of just about every seminar I took (Pat: &lt;Laugh&gt;.) on nation building and democratization. Final question…

Pat (27:57):


Faith (27:58):

Yes, exactly. (Pat: Yeah?) Final question for you, Pat, are you from New York?

Pat (28:04):


Faith (28:05):

Boston! Man. I’m from Buffalo, (Pat: Okay.) and I feel like I can never get the Long Island and Boston accents straight. Are you still based in Boston?

Pat (28:16):

No, I live in Northern Virginia now. I moved down when I took over that company that was involved in defense and intelligence. So I’ve been here for 27 years.

Faith (28:26):

Wow. I’ve got a lot of people in my life from NoVa. (Pat: Oh, really?) Yeah, including Mr. Faith, who now lives here in Nashville with me <laugh>.

Pat (28:34):

Oh, very cool. Yeah, you know, it’s an incredibly growing area (Faith: Yeah.) for, you know, so it’s been good.

Faith (28:41):

It’s cool. My aunt and uncle are up there, and they just, they bought a ski place in Pennsylvania, and I’m like, (Pat: Oh, really?) cannot believe that’s what we’re doing.

Pat (28:50):

Not much skiing in Pennsylvania.

Faith (28:53):

No, no. That’s what I hear. Pat, thank you so much for joining us. (THE FRONTIER THEME FADES IN) This conversation has been so interesting. Like I said, I feel like I’m back in college. (Pat: <Laugh>.) If folks listen, and they wanna reach you or the team at Seekr, where can we direct them?

Pat (29:09):

You can direct them to, let’s see,, where they can download the product, or they can go to Yosha for, you know, at Marathon, or they can reach me at [email protected].

Faith (29:23):

Excellent. All right. Well, we’ll include those resources in the show notes. I really appreciate it, and thank you so much. We’ll see you next time.

Pat (29:30):

Yeah, this was fun. Thank you.

Faith (29:33):

Thanks for listening to The Frontier podcast, powered by We drop two episodes per week, so if you like this episode, be sure to subscribe on your platform of choice, and come hang out with us again next week, and bring all your internet friends. If you have questions or recommendations, just shoot us a Twitter DM @theFrontierPod, and we’ll see you next week.