Moral Artificial Intelligence – China Mapping Pig Faces – Trusting Online Reviews: AS HEARD ON WGAN: [03-06-19]

On This Episode…

Craig is on with Ken Altshuler and Phil Harriman, who is covering for Matt. They talked about whether smart assistants should have a moral code, China mapping pigs’ faces, and whether you can trust online reviews.



For Questions, Call or Text:

855-385-5553

TRANSCRIPT

Below is a rush transcript of this segment; it may contain errors.

Airing date: 03/06/2019

Moral Artificial Intelligence – China Mapping Pig Faces – Trusting Online Reviews

Craig Peterson 0:00
Hey, good morning, everybody. Craig Peterson here. I had fun this morning. I was on with Ken Altshuler, and he had a guest host in with him, Phil Harriman, a former senator there in the state of Maine. And we talked about a couple of things, and I managed to rip him a bit. He’s a big-time supporter of left-wing causes. I don’t want to call him a Democrat because he’s not really a Democrat. But, you know, I got to rip him this morning. We talked about smart assistants and whether or not they should have a moral artificial intelligence, and we got into the Chinese tech firms mapping pig faces this morning. It was really kind of fun. You gotta listen. Anyhow, hope you’re having a great day as well. And we did talk a little bit about online reviews, but not as much as yesterday with Jim. So here we go.

Ken Altshuler 0:53 
Always on a Wednesday morning at 7:38, we welcome in our tech guru, Craig Peterson. Good morning to you, Craig.

Craig 1:02 
Hey, good morning. Yeah, I’m a little clogged up myself this morning; I was just coughing and hacking. Oh my. That time of the year.

Ken 1:10
New England in March. Isn’t the weather like 20 degrees below what it’s supposed to be?

Craig 1:15
I thought it was global warming. Is it not Ken?

Ken 1:20 
It is. No, it’s climate change. Climate change.

Craig 1:24
Oh, so when global warming doesn’t work, it becomes climate change?

Ken 1:28 
Correct

Phil Harriman 1:28
No, no, no, no. It’s extreme weather.

Ken 1:32
Yes exactly. Extreme weather. Highs and lows and bad hurricanes, bad tornadoes.

Craig 1:37
Here in Maine, I’m a firm believer in climate change. Absolutely. We have four seasons at least every year. That’s four times the climate changes.

Ken 1:46 
In Maine we have two seasons, Craig: winter and the Fourth of July.

Craig 1:51
That’s road repair and winter.

Ken 1:55 
Exactly. So let’s talk about some tech. So I have, as you know, Craig, an Alexa. And let me say, for all you people listening out there: Alexa, pay attention. So should we do something with these smart assistants so that they have a kind of moral code to follow?

Craig 2:05
Yeah, this is real interesting stuff, and I’m not sure exactly which side of this I come down on. We’ve heard a lot about artificial intelligence, right? It’s been kind of the bane of science fiction writers forever, from Colossus: The Forbin Project, a big movie back in the ’60s, all the way through today, and of course even before that. The whole idea with artificial intelligence is that our computers are going to be smart enough to learn by themselves, make decisions by themselves. And today we have really no artificial intelligence per se, but we do have what’s called machine learning, where our computers can learn. Your newer iPhones have a machine-learning chip built into them, and that helps them learn a little bit about you. And every time you use your Alexa or your Google Home device, it’s learning a little bit more about you and what you do and how you do it. So it’s coming; it’s kind of inevitable, we think, though nothing’s passed what’s called the Turing test yet, for those of us that are a little more geeky about this. But when we’re talking about our Alexa, there’s been a proposal that’s come out. Obviously it makes sense to have your Alexa be able to call the police; it can call your friends, anybody that’s in your phone book. My granddaughter used an Alexa we got for them just the other day. She was stuck outside the house; she’d locked herself out. We had an Amazon Alexa in the house that we’d given them, and she went ahead and yelled through the window, “Alexa, call Mom,” and of course it did. And Mom said, “Call me back, this is a bad connection.” No, no, no, she was locked outside the house, and she kept at it until Mom understood. And Mom managed to get home with the key and was able to let my granddaughter in. Those are all really good things.

But we’ve got some scientists now over in Norway at the University of Bergen who were just speaking last week at a conference on ethics and society out in Hawaii, and they were saying that all of our smart devices should listen to what’s going on in a home and then use artificial intelligence to determine if maybe they should report the goings-on in the home to the police, child and family services, etc. You know, Ken, you talk about Orwellian; I can’t think of anything more Orwellian than this. We have been putting these devices into our homes, into our cars, and now pretty much everywhere in our phones, and they’re talking about turning them into spies. For who, right? How far can this go? I don’t know, Phil and Ken. This really scares me.

Phil 5:36
Well, just listening to you describe the power a device in your home could have is frightening. And I would assume this has got an onboarding opportunity as well, meaning that they can put things into your system that get communicated into the house, or control things that are in your house that you have no influence over?

Craig 5:54 
Yeah, absolutely, they could, and they can upgrade them. For instance, Google just got nailed because they’ve been selling this home security device, and no one knew that Google had hidden a microphone inside of it, a microphone that Google just activated this month. So it’s really, really scary. And we’re talking about an ethical conflict here between people in the family, you know, between mom and dad and the kid. What happens when the kid is reaching up to grab a hot pot of water off of the stove, and mom screams at the kid, legitimately, and now that little device in your home makes an ethical, moral decision and reports you to the police, along with all of the recordings that device has made any time it thought, well, this was marginal, but it’s not enough to call the police? Now all of a sudden the police have, completely out of context, all of this stuff that makes you look really, really bad. And then there are the ethical conflicts between not only the members of the family, but the manufacturer, the manufacturer’s shareholders, the programmers, the police department. And with these universities saying yes, indeed, we should have them recording, and other people jumping on board and agreeing with them, I think we’re in for some rough road.

But take that to our cars, right? And in Maine, we’ve got some of these artificial intelligence companies doing some development, along with mapping technology, much of which originated right here in Maine. Right? If you’re in an accident, whose fault is it? Is it yours, because you own the car? Is it the people who wrote the software? Is it the car manufacturer, who hired the company that hired the people who wrote the software? You know, Ken, maybe you should get out of family law and into some of the ethical law, because for the next 20 to 50 years things are going to be just crazy in that side of the business.

Ken 8:07
We’re talking to Craig Peterson, our tech guru.

Phil 8:08 
Big Brother.

Ken 8:09 
What are you talking about?

Phil 8:10 
It was just frightening what he’s saying.

Craig 8:14
AOC’s on board, I’m sure, so it’ll be okay.

Ken 8:19
Is there a reason why you guys are ganging up on me today?

Phil 8:24
Yeah. Because I like you.

Ken 8:26
I think she does a very nice dance on rooftops. That’s all I want to say. We’re joined by Craig Peterson, our tech guru, who joins us Wednesdays at 7:38. Online reviews. Now, I went the other day to buy a wrist brace for my right wrist; I have something from when I played piqua. And, you know, I went to read the reviews, because how else do I know which of these products are good? I mean, I assume that those reviews were accurate. Not necessarily, huh?

Craig 8:53 
Yeah, the legitimacy of these things comes into question, and it can be a very, very big deal because we’re using them all the time. I use them on Yelp. I go into a new town, I’m traveling somewhere, and I just pull up my Yelp app and say, you know, where are the nice restaurants in the area? How about you guys, what do you do? I tend to not trust reviews when it’s like one or two, or five, or ten, right?

Ken 9:25 
If it’s like 400 or 500, I tend to pay attention.

Craig 9:27 
Yeah, exactly. And that makes sense. And there are reviews everywhere, right? Well, there are some tips on what to do and what not to do when you’re looking at these reviews in order to judge their veracity. And there was even a study done on this; there’s a study on everything nowadays, I think, frankly. But on Amazon, Yelp, Facebook, and Google, it is easy for businesses and others to purchase hundreds of reviews within days. So there’s your 400 number; you know, they can all be false. And then the other side of this is businesses will sometimes post negative reviews for their competitors, which is another big problem. So when you get right down to it, according to the study, some 30% of online reviews are fake. NBC News created a gardening business on Facebook. They paid 168 bucks to some online websites that promised positive reviews, and for 168 bucks they got 1,000 likes, and a few days after that they got more than 600 five-star reviews.

So I think, Ken, we’ve got to be careful about this. I personally look at the reviews, I look at what’s been written and how it’s phrased, and I try to evaluate it from that. But one of the easiest things you can do to figure out if reviews are false is to look at the language that’s being used. Because when you do purchase reviews, you give a sample of what you’re looking for these fake reviewers to say, and oftentimes they’ll repeat it. Phil, you ready? They’ll repeat it. Just like Democrats repeat the morning news bites and keep talking about them all day long.
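If you wanted to apply that repeated-language tip with a little code, here’s a minimal sketch in Python. It uses only the standard library’s difflib to flag pairs of reviews with suspiciously similar wording; the sample reviews and the 0.8 cutoff are hypothetical, just for illustration.

import difflib
from itertools import combinations

def similarity(a, b):
    # Rough 0-to-1 measure of how much two review texts overlap.
    return difflib.SequenceMatcher(None, a.lower(), b.lower()).ratio()

def flag_near_duplicates(reviews, threshold=0.8):
    # Return index pairs of reviews whose wording is suspiciously similar.
    return [
        (i, j)
        for (i, a), (j, b) in combinations(enumerate(reviews), 2)
        if similarity(a, b) >= threshold
    ]

# Hypothetical samples: paid reviewers often echo the seller's suggested script.
reviews = [
    "Great wrist brace, super comfortable, five stars!",
    "Great wrist brace, very comfortable, five stars!!",
    "Shipping was slow, but the brace fits well.",
]
print(flag_near_duplicates(reviews))  # expect [(0, 1)]

A real screen would also weigh things like reviewer history and review timing, but repeated phrasing alone catches the laziest purchased reviews.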

Phil 11:27
Do you like us now, Kenny? I can’t. I can’t let this go any further without shifting gears to this: China’s tech firms are mapping. Wait for it. Wait for it, folks. They’re mapping pig faces. Alright. Take it away. Take it away, Craig.

Craig 11:44
We’ll make this one really quick because I know we’re out of time here. But yes, there is a very big problem in China right now, and it isn’t the pig farts. What we’re talking about is disease, and of course there’s a lot of trans-species disease stuff that can happen, certainly with birds; most of our flus come from birds. But in this case, here’s what’s happening. China has been using facial recognition technology for a long time to spy on its own people. They’ve been doing that a lot in London as well, and in other places in the UK. But right now what they’re trying to do is track the pig farmers, because many of the small pig farms are polluting the environment. Yes, indeed, that is the truth. The AOC thing was a bit of a joke there, but they are polluting the environment. So they want to keep track of the pigs, where they came from, what diseases they might have, and they’re doing it with facial recognition technology on the pigs. They’re also listening to the pigs’ conversations to determine if a pig might be sick, because apparently pigs talk differently when they’re not feeling well.

Phil 12:57
So, Craig, as we move on: from this point forward, when we refer to AOC, we can also refer to KPA, Kenneth P. Altshuler.

Ken 13:09
Craig Peterson joining us. He joins us, Wednesdays at 7:38. Craig, thanks for joining us. We’ll talk to you next Wednesday.

Craig 13:16
Hey Ken, thanks for being a good sport. It was kind of fun.

Ken 13:20 
I don’t mind at all. Thank you, guys. We’re gonna take a quick break.

Craig 13:26
Hey, I released Module Three yesterday, and we had a great live coaching call yesterday as well for everyone in the course. So shout out to you guys. You should have gotten Module Three; let me know if you did not. And in Module Three, of course, we’re delving into network security stuff, what you can do and how to do it. Take care, guys, I’ll be back tomorrow. I’m going to do a couple of security things in the podcasts this week, one on Thursday, one on Friday, so keep an eye out for those as well. Thanks again. Bye-bye.