Poem: "Sufficiently Advanced Reality"
Aug. 6th, 2019 11:06 pm
This was inspired and sponsored by mdlbear. It also fills the "pictures" square in my 8-2-19 card for the End of Summer Bingo fest.
"Sufficiently Advanced Reality"
If cyberspace is the Jedi Tree,
containing only what you bring into it
then augmented reality is fairy dust,
enchanting everything you sprinkle it on.
Every picture offers possibilities.
You can trap and train monsters,
name plants you don't recognize,
and even identify human emotions.
Sufficiently advanced reality
is indistinguishable from fantasy.
* * *
Notes:
Augmented reality adds an additional overlay of information onto the physical world.
The Jedi Tree, or Force Tree, contains nothing but what you bring into it. All it does is turn your darker impulses against you.
Pokémon Go is a game that uses GPS to serve localized information to players.
Some AR programs can name plants or other objects. Face recognition and emotional analysis raise much more serious privacy concerns, because many people do not consent to having those used on them.
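The Pokémon Go note above describes the core mechanic: compare the player's GPS fix against a table of nearby points of interest. Here is a minimal Python sketch of that proximity check; the spawn names, coordinates, and radius are invented for illustration, not the game's actual data or API.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))  # Earth mean radius in meters

# Hypothetical nearby "spawns": (name, lat, lon) -- invented values.
SPAWNS = [
    ("Bulbasaur", 40.7130, -74.0061),
    ("Zubat", 40.7200, -74.0100),
]

def nearby(player_lat, player_lon, radius_m=70):
    """Return the spawns within radius_m of the player's GPS fix."""
    return [name for name, lat, lon in SPAWNS
            if haversine_m(player_lat, player_lon, lat, lon) <= radius_m]

print(nearby(40.7128, -74.0060))  # -> ['Bulbasaur']
```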
(no subject)
Date: 2019-08-07 05:29 am (UTC)

This is my job.
This is what I do.
The imposition of Order on Chaos.
System on Fact.
Classification on Data.
I measure time in bits, bytes, kilobytes, megabytes, terabytes.
The future has teeth.
And you're seeing all of your worst fears about technology come true.
Not only is big brother watching you,
He knows your PIN number.
Welcome, my son, to the machine.
Your fate is being decided by forces you can't even begin to comprehend
and you feel like a hairless pink fetus
floating in a Plexiglas bathtub
somewhere deep inside The Matrix.
But don't worry, Coppertop.
It's cool.
-- "Technical Support", Ernie Cline (relevant excerpt)
Well ...
Date: 2019-08-07 06:28 am (UTC)

It won't be much longer until the authorities have become so dependent on machines to do all the work for them that they'll forget how to do it themselves, and people who aren't dependent on machines will be able to ghost the hell out of them. It's already started, and they've done it to themselves, simply because the growing inequality makes it impossible for some people to use the system in the first place.
(no subject)
Date: 2019-08-07 05:12 pm (UTC)

This is probably not what you meant, but it seems to me that many people actually expect those things and get upset if you don't do them - but they also expect you to do it in your head instead of with help (even if you need that help to do it at all).
Personally I don't think there's any real difference - other than the tech variant being easier to abuse, primarily due to cost.
In my opinion, given the above expectation, personal (offline) use to recognize people you know should be fine - the problems occur when you start sending the images or results online, or using it en masse to build large databases of people. Those problems would still be there if the task was done by hiring masses of people to do the job instead of using tech.
The tech can do it much more cheaply, though, which makes it more accessible to unscrupulous people, and once they do some bad things with it... Well, people have a tendency to blame the tech rather than the ones who misuse it.
Thoughts
Date: 2019-08-07 11:23 pm (UTC)

It is different when a person does those things than when a computer does them. You are right that people expect humans to do them and become aggressive if they are not performed to their standards. But using a computer changes many aspects of the interaction. Problems include, but are not limited to:
* Once data exists in electronic form, it is no longer secure. All you can do is make it more challenging to access. Hackers break into "secure" data every day.
* If data is recorded and transmitted, it becomes even more of a hazard.
* All the programs I've seen are trained on "normal" people. That means they won't know how to parse other types of input accurately. We've already seen massive problems when algorithms are trained on white faces and can't handle black ones, or on men and can't handle women, etc.; a sketch of how to measure that gap follows this list. But the user will be trained to treat the program's information as "real" and will act on it as such, which is a serious problem if it is inaccurate. And the user can't back-check it, because if they could do that, they wouldn't need a program to read emotions for them.
* Technology is less reliable around some people. This used to be a minor nuisance but is rapidly becoming a major handicap. And unlike most handicaps it doesn't just affect the person who has it. It affects them plus everyone inside their area of effect. I know not to handle other people's electronics. But if the people around me routinely carry lots of fragile stuff, some of it's going to break. They won't know why. I won't notice unless they start swearing at it. If I back away, maybe it'll revive or maybe it's a brick. With a phone this is aggravating; with adaptive equipment it can be disabling. And I'm not the only one with this trait.
* The issue of consent is real, but not respected. You don't have a right to do things to people against their will. But cyberspace is predicated on doing exactly that. So as things like face recognition and emotion reading roll out, people will be told they have no right to object and it will be used on them whether they like it or not. Some will simply withdraw from public life, which is not a good outcome. But others will beat the shit out of individuals wearing tech they find offensive, which is already happening to cyborgs and people with Google Glass. That's a different kind of problem. They both stem from a disagreement over who deserves the right to go out in public.
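As a rough illustration of the training-gap point above, here is a minimal Python sketch that scores a classifier's accuracy separately per demographic group instead of overall; all group names, labels, and numbers are invented for illustration.

```python
from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group, predicted_label, true_label)."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        hits[group] += (predicted == actual)
    return {g: hits[g] / totals[g] for g in totals}

results = [  # (group, model said, ground truth) -- invented data
    ("group_a", "happy", "happy"), ("group_a", "sad", "sad"),
    ("group_a", "happy", "happy"), ("group_a", "sad", "sad"),
    ("group_b", "happy", "sad"),   ("group_b", "sad", "sad"),
    ("group_b", "happy", "happy"), ("group_b", "happy", "sad"),
]
print(accuracy_by_group(results))
# A large gap between groups (here 1.00 vs 0.50) is exactly the
# "trained on one population, fails on another" failure mode.
```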
Based on the bad decisions I see society making about other applications of technology, and the problems caused, I expect this to go in similarly troublesome directions. It could be used responsibly. It does have potential benefits. But the way people are behaving, I think the harm will greatly outweigh the benefits.
>>Personally I don't think there's any real difference - other than the tech variant being easier to abuse, primarily due to cost.<<
Some people may consider human reading and machine reading of emotions to be the same. Most don't seem to feel that way.
Also, it doesn't matter whether the machine is on a person or on a kiosk. It's the machine reading that people object to. Another touchy topic is the rise of kiosks, mostly for advertising, that use cameras to read faces, identify emotions, and target users for different material based on that information. Which pisses off everyone who hates targeted advertising or other intrusive manipulations, and that's a lot of people.
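For concreteness, a minimal sketch of the kiosk pipeline just described -- camera frame to face to emotion guess to targeted ad. Every function and table here is a hypothetical stand-in, not any vendor's actual API; the point is that nothing in the loop ever asks the viewer.

```python
def detect_faces(frame):
    # Placeholder: a real kiosk would run a face-detection model here.
    return frame.get("faces", [])

def guess_emotion(face):
    # Placeholder: a real kiosk would run an emotion classifier here.
    return face.get("emotion", "neutral")

ADS_BY_EMOTION = {  # invented targeting table
    "happy": "upsell_ad",
    "sad": "comfort_food_ad",
    "neutral": "default_ad",
}

def pick_ad(frame):
    """Choose an ad from whatever the camera sees -- no consent step."""
    for face in detect_faces(frame):
        return ADS_BY_EMOTION.get(guess_emotion(face), "default_ad")
    return "default_ad"

frame = {"faces": [{"emotion": "sad"}]}  # invented test frame
print(pick_ad(frame))                    # -> comfort_food_ad
```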
Now imagine that they're pushing down their desire to fight back because they know a machine won't care ... and then they run into a human wearing such a machine. That situation's going to get real ugly real fast, and I don't want to see that happen. Boundary violations hurt people, and even if those people are told they aren't being hurt and have no right to protest, sooner or later it's going to come out.
And the ones who don't realize they have boundaries because those have been violated so much the sense doesn't even develop? They're the tweens sexting each other and the adults writing their password on their desk, which also drives society batshit.
>>In my opinion, given the above expectation, personal (offline) use to recognize people you know should be fine <<
It is fine if you use it in a controlled environment, such as your home, wherein all the people present have consented to its use. In this regard it would doubtless make life easier for many families. But when you leave that closed location and enter the public sphere, many other people are present who have not consented to participate. Then people's rights come into conflict.

There's no easy answer to this one that gives everybody everything they want. The best I can think of is to sort by location, as we do with many other objects -- they can be used in some places but not others. Then you're down to places people HAVE to go sometimes, like courthouses, which usually means somebody's rights get stepped on, and the question is whose. Sorting isn't always feasible at that level of granularity, although people might manage if they care enough to map routes and reorganize what goes where or is done when. In a society where only a handful of stores even offer quiet hours, I don't see that happening.
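One way to picture the "sort by location" idea: a minimal sketch in which a device default-denies recognition and only enables it inside zones whose occupants have consented. The zone name and coordinates are invented for illustration.

```python
# zone name -> (min_lat, max_lat, min_lon, max_lon); invented values
CONSENTED_ZONES = {
    "my_home": (41.8798, 41.8802, -87.6304, -87.6296),
}

def recognition_allowed(lat, lon, zones=CONSENTED_ZONES):
    """Default-deny: the feature only runs inside an explicit zone."""
    return any(lo_lat <= lat <= hi_lat and lo_lon <= lon <= hi_lon
               for lo_lat, hi_lat, lo_lon, hi_lon in zones.values())

print(recognition_allowed(41.8800, -87.6300))  # True: inside the home zone
print(recognition_allowed(41.8900, -87.6300))  # False: out in public
```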
>> the problems occur when you start sending the images or results online, or using it en masse to build large databases of people. <<
The problems certainly get worse then.
>> Those problems would still be there if the task was done by hiring masses of people to do the job instead of using tech. <<
Now there's a fascinating question. I'm not sure of that. Would people care more that it's a human doing the reading? Or would it be trumped by the fact that hardware and transmission are required? Can observers even tell the difference? Almost certainly not, unless people deliberately flag what they're doing.
These issues matter because an existing service does exactly that: sighted people look through a camera worn by a blind person and describe in detail what is there. Blind people have found this extremely useful. Unfortunately it's very expensive and not subsidized. A most excellent solution would be pairing up blind people with mobility-impaired people for this purpose, as Terramagne does. They aren't having a problem with it. But they have much better privacy protection and thus a high-trust society. I'm not sure it would be tolerated here, even though the practical aspects would work just as well.
>> The tech can do it much more cheaply, though, which makes it more accessible to unscrupulous people, and once they do some bad things with it...<<
Those are more problems.
>> Well, people have a tendency to blame the tech rather than the ones who misuse it. <<
Because you can see the tech at a glance and recognize it as a risk. You usually won't know the person, or whether they are good or bad; and you certainly won't know what program is running, where the data is going, or what could be done with it. In a society that routinely violates people's privacy and safety with bad data handling, some people are ceasing to care because nothing keeps them safe, and some people are defending their boundaries more aggressively in hopes of keeping something safe. None of this is good. All of it means that a lot of tech which would be great in theory is more trouble than it's worth in practice.