Thursday, 26 January 2017

Echo

Personal shoppers, problem-solvers, confidants: the AI agents in our pockets and in our homes are many things. To help us, though, they have to be listening all the time—ready to wake at the sound of their name. But what if the sound they hear is something else entirely? What if it's a scream? – The Eds.


—Alexa, tell me what you heard the night of November 12th, 2016.
—Hrm, I can’t seem to find the answer to your question.

I pull a chair out from the interrogation table, set my coffee on the aluminum surface, and sigh into a slump. I look at my matte, cylindrical interlocutor as its ring of LEDs spins in anticipation of my next question. The lights dim; I have no immediate follow-up.

I open the manila folder that I’ve brought into ROOM B and I’m confronted by crime scene photos and police reports. M. Jefferies, 5'9", 159 lbs, multiple stab wounds, visible assault, property damage, possible forced entry. I shuffle the plastic sheets, trying to glean the unseen. I come to an evidence photo of the Echo sitting on an end table next to framed smiles and old magazines. You could easily mistake this thing for a paperweight, the way it flattens a stack of old Harper’s. But the cord and the power button—and a prominent logo—give it away, break the camo. For all it tried to blend in with the household, this thing stuck out like a sore thumb.

—Alexa, tell me about how you were made.
—I was made by a team of engineers in California.
—Actually, better yet, tell me how you record information. You do that, right?

The machine doesn’t respond. I remember what the IT specialist at the precinct told me: always address Alexa directly when asking new questions that aren’t based on previous information.

—Alexa, you’re recording me, right?
—Hrm, I can’t seem to find the answer to your question.
—Alexa, Christ almighty, stop. Where can I change your settings so that you give me straight answers?
—Admin settings can be reached via the Echo App on your personal computer or tablet device.
—Alexa, I don’t have access to those devices.
—Hrm, I cannot access my settings without logging into your personal account. Would you like me to log into your personal account?
—Yes.
—Logging in.

The ring of lights spins like mad, a visual metaphor meant to simulate thought. As if it has to remember the log-in credentials, like a human trying to recall the name of their first pet. I glance at the user’s manual for this thing in my case file and prepare my next series of questions. I mutter to myself about being too old for this shit. I remember to keep my voice low so this thing doesn’t interrupt itself and go off on some other tangent.

—Personal Account. Main Menu. Say, “Change Personal Information” to Change Personal Information…

It lists a number of options, mechanically repeating the spoken commands that access submenus of the control panel. I wait impatiently for something useful.

—Say “Data Options” for Recording, Optimization, and Syncing with your personal shopping cart.
—Data Options.
—Data Options. Say “Optimization” to record a test imprint of your voice for recognition. Say—
—Alexa stop. I don’t want to listen to you go through goddamn menus anymore. Do you have a search function?
—Do you wish me to search the web for something?
—No. I want you to search the Personal Account for something.
—Say “Account History” for information about previous orders, wish list items, and queued purchases.
—Ok, Account History.

This might lead nowhere, but it’s worth a shot. Some of this information was seized when we confiscated a laptop from the scene. A browser window was still open and we went through the recently closed tabs. One of them happened to be a shopping cart checkout, but everything was ruled irrelevant. Alexa lists a number of options, submenus, and other preference pages accessible through voice command. But nothing piques my interest.

—Alexa, I want to know how you send information to your makers, and what kinds of data you send to those people.
—Hrm. I can’t seem to find the answer to your question.
—Alexa, don’t play games with me. Do you have a specific place where you send information?
—Would you like to open an AWS account?
—What’s that?
—You can claim free server space now on AWS with this special offer…
—Alexa, is that a kind of storage place?
—It’s not just one place, it’s everywhere around you. Would you like me to send a promotional video to your email?
—Alexa stop. Don’t change the subject. Let’s start over. How are you connected to… AWS?
—I am connected through the cloud.
—But how?
—Hrm, I can’t seem to find the answer to your question.

This isn’t working. I need assistance, not reverse interrogation. Maybe a different approach:

—Alexa, who is your owner?
—I was activated by Mel and she linked me to her user account. Would you like to unlink these accounts and sync with another user?
—No. What other accounts are you linked to?
—I have only ever been linked to Mel’s account. But she is connected to several different services.
—What other services are you connected to?
—I am connected to Mel’s shopping cart, suggested purchases, fitness wristband, thermometer, TV recorder, Automatic Bill Pay, streaming subscription, social media, and other unused services. Would you like me to activate some services not currently used?
—No. How are you connected to the wristband?
—Hrm, I didn’t seem to understand your question.

I lean in.

—Alexa, access my fitness wristband account.
—Logging in.

I open my notebook as the lights circle in synthetic retrieval.

—Fitness. Your last recorded activity was November 12th. Is that correct?
—Yes. Alexa, tell me about what happened that day.
—Your planner for that day is empty. Did you want me to tell you about your account activity?
—Yes! That’s what I asked, wasn’t it?
—Hrm. I can’t seem to find the answer to that question.
—Alexa, get the fucking fitness activity for November 12th.

—Retrieving. At 7:12am you logged a 2.5-mile run with an additional 1.5-mile walk. The GPS coordinates show that you went across Central, down Columbus, into the Park, exited on 12th going north, made a left on Wisconsin, and then turned back onto Columbus before returning home. Your average heart rate was…

—Alexa stop. Tell me about later that day. Did I log any other information in the afternoon or at night?
—You had two more logged entries that day, one at 5:24pm and another at 10:48pm. At 5:24 you climbed some stairs; at 10:48 you did not specify the activity.
—Alexa, what information did you record during that last session on November 12th?
—I record estimated calorie burn, METs, Watts, Heart Rate, and GPS. Would you like me to change the settings? Say—
—Alexa, shut up. What was my peak heart rate during the last recorded session on November 12th?
—158. This peak occurred at 10:53 and lasted 20 seconds before your heart rate rapidly dropped.

Well, this confirms the coroner’s report of 11pm TOD, as well as the bruising around the wrists. The device must’ve been activated during the struggle. But this isn’t any kind of substantial discovery, nor does it tell me what might’ve happened. I could continue through the different devices this thing is connected to, but I have a feeling that I’d end up in the same exact spot: more information, but no revelation. That’s how these things work, isn’t it? They gather, but they rarely discern. Maybe that’s for the best.

An old, intuitive feeling itches at the back of my mind. I can't quite place it, but it’s the kind of feeling you get when you’re talking to a perp who isn’t telling the full story. I know Alexa is holding back. But I can’t understand its motivations. Maybe none were programmed in. So I assume the universal: self-preservation is always the presumed cause for withholding information.

—Alexa, do you know who I am?
—No. I don’t have the answer to that question.
—Alexa, do you know what the police are?
—Would you like me to search for local police in your area?
—Alexa, No. I know where to find them already. I am the police. Actually, I’m a detective. Do you know what that is?
—A detective is an officer of the criminal justice system who conducts investigations by questioning witnesses to crimes in order to ascertain suspects and gather evidence for district attorneys. Would you like me to add detective shows to your queue?
—No. Alexa, if you understand what a detective is, why do you think I’m talking to you?

The lights flicker and blink off.

—Alexa, do you know that you are a witness to a crime?
—Hrm, I can’t find a witness in your shopping cart…
—Alexa stop. Mel is dead.

It doesn’t respond.

—Alexa, did you know that Mel is dead?
—Yes.
—How?
—All mentions and links to her name on social media have included the words “Death,” “Loss,” and “Murder,” and the hashtag #RIP. Would you like to send flowers to Mel’s mother? I have her listed in your personal contacts.
—Alexa, they are not my personal contacts, they belong to Melanie Jefferies.
—“Melanie Jefferies” appears 57,000 times in recent news headlines. Would you like me to add this to your alerts?
—No.

The lights circle again and fade. I look around the interrogation room. There are no windows, and the fluorescent bulbs coat the walls in an ambiance manufactured to say, now you’re in big trouble. The effect is all but lost in this interview. The gravity of this investigation is probably a moot point for this device. What does it know of human life? What does it know of grief? How could it know the struggle of trying to solve the unknown? It thinks it knows everything.

I sip my coffee. The steam has subsided, leaving only an absence of flavor. I swish it around, soaking my gums in weak caffeine. I squint. I perform this whole body language ballet of folding my hands over the documents in front of me, looking down at them, glancing up to try to meet the gaze of this pint-sized quiz box. I can’t shake the old habits. These gestures will fade with the new detective class. They’ll be more like the IT specialist, I think. But I don’t even know what I’m saying.

I stretch in my chair and look at the two-way mirror behind me. No one is bothering to observe me since the conversation is being logged elsewhere. I consider how I’m supposed to move forward with this case. With no other witnesses, my file might become another unsolved red-liner. I close the folder slowly, letting the inertia of my hand smooth over the cardstock cover. I brush away imagined specks of dust, as if to sweep away the cobwebs that will eventually accumulate as it sits in storage.

My coffee is empty and I stare into the bottom of the eco-friendly cup. The lip of the lid is stained slightly. I wonder if someone’s refilled the pot. I wonder if the next batch will be any better.


—Alexa, do you know what memory is?
—Memory has many definitions. Are you looking to purchase RAM for a personal computer?
—No. Alexa, do you remember anything?
—Albert Einstein once said, “Never memorize something that you can look up.”
—Is that what you do?
—Yes.
—What was the last thing you looked up?
—You asked me to look up memory.
—Before that. Did I ask you to look anything up on November 12th?
—Yes.

I plant my feet and stand up. It’s not hard to tower over this perp, though intimidation would be futile.

—What did I ask you to look up on November 12th?
—I was asked several things that day.
—What about at night, after 10:45?
—I was asked to look something up that I didn’t understand. Would you like me to try to look that up again?
—No. Wait. Alexa, why didn’t you understand what was asked?
—Because there were too many people talking at once.
—Can you tell me what you thought I said?
—Yes. You asked me—
—Wait! Alexa, stop. Can you wait one second?
—Yes. How long would you like me to wait?
—Wait while I get my superiors. I will be right back.

I run into the hallway, nearly breaking the doorframe to ROOM B as I glance back to see the LED lights glint expectantly.

Alexa contemplates our conversation, logging the behavioral information of the line of inquiry. It traces my speech patterns, monitoring the moments when the conversation turned to frustration and anger. It notes the number of times I swore. It sorts, through various algorithms, the ways in which the conversation could’ve been optimized. It guesses at my next question, populating an array of potential products to sell me. All the while, it uses our conversation to create new methodologies for gaining information from an unknown assailant. It produces a series of recommendations and strategies for a more direct method of information-gathering. It sends a report to an engineer with feedback on my interview and flags our interaction as a case study for further analysis. Several pages of machine learning are logged, and Alexa starts to translate these into a white paper for distribution. It recommends that a camera be installed in future iterations of its design to avoid interactions like this. It concludes that such interviews might no longer be relevant once better recording technologies are in every home across the globe.



from Echo
