
"Justin Abling, what's eating you?" said the note from Alexi, her handwriting as bold and forthright as she was. "Loosen up, buster. Leave your jet lag or whatever the hell behind."

The prototype of the Truth Machine we used in the office printed out slips of yellow paper like this one, with a number series followed by a gloss at the bottom. I thought I could smell the ink, even above the dank, fume-laden air of the tunnels.

I had fallen into a controlling mind-set, the numbers said. Manipulative. Maybe I even lied, when I spoke to her. Maybe?

The Truth Machine said I had, and that was that.

I wanted to crumple the paper and leave it on the subway with the gum wrappers and magazine inserts. Some wise old rat would come along, then, and sniff the oils my fingers left on the wrinkled paper, and next morning the gray thing would be perched on my stoop, the wad in its teeth, its nose pointed upwards like an accusing finger. "I found this," it would hiss. "It's yours."

Maybe I had worked too hard, the last few months.

Whittled down to just a company man?

Right. The salesman for the Truth Institute tells a fib.

The wadded paper went into my pocket as I took the stairs to street level. There, a row of restaurants shined their signs toward me: Armenian, Japanese, Thai, American Sub, American Burger, American Pizza, Irish Pub.

Manipulative, the note said.

"I am looking for an honest man," the robot said.

I jumped. People walked to each side of me, oblivious: they had seen it a hundred times. The Diogenes Robot, six feet tall, stood on the sidewalk, one arm upraised, the other stiffly at its side. The upraised claw held a lantern. Nothing shined inside the hollow cage of the lantern. No bulb, no flame.

The robot's face:

Round, glass-shielded eyes. Raised areas in the forehead for brows. Another raised area for a nose. Empty slit of a mouth.

A tin can, pretending to be a head, perched atop a bigger tin can pretending to be a body, with twin columns of smaller tin cans pretending to be legs underneath.

It was nothing more than that.

When I stepped aside, it went on its tin-can way.

I frowned, then smiled as well as I could, remembering the hundreds of broad-spectrum eyes dotting the fronts and backs of these robots.

Eyes, eyes, everywhere.



Little Jessica caught on to the toy first thing. Her black-coffee curls flew as she scattered the shapes through the rooms of Jenny's apartment. In the living room, she activated the green one shaped like a pyramid. It jiggled, turned, and unfolded to become a chicken. It stood on pencil-lead legs, clucked, and repeated in a scratchy, amusing voice the word Jessica spoke to it: "Boogers," she said. "Boogers," said the green chicken. Then it danced in place with wings flapping, flashing its eyes a few times before folding itself into a pyramid again.

Jessica ran into the kitchen, where a ball-shaped piece sat atop the counter. She tapped it. It unfolded into a turtle of blue plastic, with the same flashing eyes.

The tiny beak opened:

"Boogers," it said.

She laughed.

"You only need to teach one piece," I said. "But all the pieces know, once you teach that one."

Jessica looked happily at the turtle.

"People will be the same way, someday," I said. "If one child learns something in a class, there will be ways to have all the other children learn the same thing, too, without their ever being told."

"Must you make a message out of a toy," said Jenny.

Her hair had gone blonde in the year since the end of our marriage, in the same way her face had gone hard.

"Really, the toy is a message," I said. "They're hot in Japan right now, you know. They'll be big in this country soon. They prefigure bigger things. For society, I mean."

"It's just a toy," she said. "Isn't it, Jessica?"

"Jessica," I said to my daughter, "why don't you ask mommy if she wants to have supper tonight with daddy?"

I had to try.

"Jessica's mum has a dinner date with a very nice man," said Jenny, in Jessica's direction. Jenny was doing well at sounding amused and cheerful. "Jessica is going with us, remember? And daddy knows better than to ask such a silly question, doesn't he, Jessica?"


I had gone in to see Alexi first thing that morning.

"The purpose of this promotional event you envision?" she said. "Or should I call it an outright attack?"

A big-boned woman, Alexi Thomas projected gravity and solidity when she leaned forward, elbows on the blotter, her fingers intertwining. I had known days when she let emotions shape the flatness of her face. Not this one. Her large, black eyes pinned me up where I stood in my long, gray coat, in front of her desk. So I stayed standing.

Heavy-boned myself, and thickly built, I suddenly imagined myself a henchman, drumming up a job. My appearance threw people off, at times. They expected to hear nothing like the history of productivity-monitoring technology from my lips: one of my favorite topics, though.

"These robots are huge in the news," I said.

"You have no idea. But you've been overseas."

"If we highlighted the differences between the Truth Machines and the Diogenes Robots, it would help raise awareness of the Institute."

"You've looked at the file of releases we already put out, this past year?"

"Sure."

She frowned, eyes still pinning me. "Take a more careful look. Then tell me why you want to do more. We did a hell of a lot in that direction."

I leaned forward, fists on her desk. "I went through, and I saw enough. None of them say the Diogenes Robots are giving people the wrong idea about the nature of the coming machine intelligence. It doesn't help to have old stereotypes reinforced this way. These Diogenes Robots look like toys. They're straight out of nineteen-fifties movies, with gray metal faces and steel bodies. They're walking garbage cans."

"You're telling me nothing as to why we should go up against something with such a positive public image."

"It's a fraud, Alexi."

"How can a symbolic gesture be a fraud?" she said. "Even so elaborate and expensive a one as this?"

"Give me a chance. I think I can swing a meeting with David Wilkening."

"If you can get a broadcast out of it, say, of a candid talk with him about the coming machine-human interface, then, sure, that would work out fine. A comparison of attitudes. Future of the human-machine mind. Can you set that up?"

It sounded fine. Not what I planned to do, though.

"Of course," I said.



"I am looking for an honest man," Diogenes Robots said to men. "For an honest woman," to women. "For an honest person," at random, and to people of uncertain gender.

This was what the robots did. They walked around, asking the question. Hour after hour. Day after day, and night after night. While the large facial eyes spotted subjects for the question, the myriad ocelli around their bodies gathered data to send to libraries scattered around the country. Security, supposedly: anyone trying to damage a robot would be seen in the act. Many had tried. All had been caught.

The scene flashed into my mind of my last moments at Jenny's apartment:

The toy in the corner. Something new.

"What's that," I said, even though I already knew.

"Oh, Daddy, it's so cute. I'll show you," said Jessica, running to the plastic robot. She pushed a button on its back and then danced away, laughing happily. The robot, its head at her knees, walked toward me. It looked much like the ones on the streets: a silver tin can. It lifted its arm as it neared. A light glowed in the lantern cage, and in its eyes.

"I am looking for an honest child," it said.

"Own Your Own Diogenes Robot!" said the bright writing on the box tossed atop the recycling bin.

Jessica laughed and clapped. My stomach burned as I said my goodbyes.

A present from the Very Nice Man, in all likelihood.


"David Wilkening of course sunk his own wealth into the project, Justin. He's also attracted charitable funding. Diogenes Inc. is a not-for-profit. That the law-and-order community is behind it has helped, too. A few police departments even provide funding. Bringing robots into some precincts has worked better in lowering the crime rate than adding more police cruisers to those same precincts has, for instance. Instant results, in terms of crime reduction. Pretty irresistible, from their point of view. You want figures?"

"No," I said.

Rod Burke was good for facts. He tapped the rim of his cup when he made a point. Right now he pushed his glasses back up his nose, and leaned back.

"These robots have got to cost a hell of a lot more than police cruisers," I said.

"Not by so much." Tap. "There are grants and funds out there for helping law enforcement, too." Another tap. "So anyway, Justin, what's this about? You said a strategy bull session."

"I want to shake up Wilkening."

Burke's office at the back of the Institute smelled of coffee and old books. Ancient fluorescent fixtures hummed overhead.

"Why?" he said, swinging his pen back and forth.

"Number one, this is drying up our funding. Number two, it's running counter to everything we're pursuing."

"OK. Number one, it won't dry up funding. The foundations supporting us have different agendas. Your number two I don't get."

"Alexi said we've already lost one funding source, but that's not the point. The point is that the Diogenes Project reinforces the traditional social contract. It's telling people to step back from technological society and rely on themselves, and to let the subjective standard continue to be more important than any possible objective standard of social behavior. You and I know the subjective standard has never worked that well. Witness our last century."

"Granting you the Diogenes Robots reinforce the traditional social contract, I'm not sure I see why their presence gets in the way of the development of a human-machine interface, or of the objective standard you're talking about."

"The Diogenes Project reinforces independent-mode action, on the part of the individual," I said. "The robots go around looking for someone who is this ideal thing called 'honest.' This goes counter to the idea that we have a way to make everyone equally, verifiably honest. The individual cannot make himself or herself honest. Society has to create the honest individual."

"But as a sideeffect, that's what the Diogenes Robots seem to be achieving, too, in a way."

"They embody an outmoded idea. The Watching Conscience. The Eye of God."

"And that's different from the Hasserman Interface."

"Exactly. Human interaction is made up of transactions. Transactions of words and ideas. You can set Truth Values on all human transactions, by mean of the Interface. The Truth Machine establishes factual truth. It isn't fear of the Watching Eye, with the Truth Machine. It's the actuality of a watching mind, a mind that keeps tabs."


"Maybe so," said Burke. "But you're playing with fire, taking on Wilkening."

"You're the strategist here. I'm looking for approaches."

"Strategist, yes. Not blind zealot for the Interface."

"Give it some thought."

His pen tapped.

"Sure."

Tapped.


Having lunch alone at the Athens, I took a table on the roof, by the railing, beneath a flat, gray sky. I gazed down on the milling crowd below. Two young women led children in a line joined by an orange streamer. Laughter cascaded upward. Not a Diogenes Robot in sight.

I amused myself by counting and then losing count of umbrellas. It was not a raining kind of day.

I could go it alone. A morning-show interview slot had opened due to a cancellation. Wilkening had agreed to it. He owed that particular host a favor, and he recognized my name. The benefits of slight fame. Rod Burke and Alexi Thomas would come around to my point of view, in the end.

The Truth Machine Institute was developing the inevitable human-machine interface along the most rational lines, using Hasserman-style neural networks we dubbed Personal Experts. In our system, each person had her or his own Personal Expert, lodged in a cluster of chips embedded in the skull. Through a combination of brain-mapping and audio-visual monitoring, the Personal Experts logged into the surrounding environmental networks, installed in the walls. Those networks had access to centrally located time-maps of both inner and outer behavior of each individual.

Why go through all the surveillance, the logging in, and the map comparison? Because the end result was a profile of each person in the system. These profiles we called Truth Templates. Each action or utterance of a person within a business could be assigned a Truth Value, based on these templates. For businesses dealing with questions of employee diligence and administrative integrity—I have yet to encounter a company not concerned with these issues—the Truth Machine Institute offered a means of monitoring diligence, and increasing integrity.

Diligence and integrity:

The world needed what we were offering.

We already had a few takers.


The afternoon turned into a line from James Thomson: The sun has never visited this city. Clouds gathered so thickly overhead the lit windows dotting the sky were like stars in the night.

An old two-rails screeched. A driver cursed out his open window. Puffs rose of mentholated smoke. Suits marched by, the color of birch leaves faded and bleached by winter. The sky let go and drizzled on my bare head after all.

I bought a water bottle from a sidewalk vendor, and spent a few moments admiring some animated lapel pins. One caught my eye: orange face of a bear, with huge eyes. Might work for a present for Jessica. The face had words in its retinas. It blinked. "Not," said one eye; "smart," said the other. It blinked again. "But," said the one; "honest," said the other. My hand stopped on its way to my change pocket.

The dark stair leading underground called, with its rumbling and drafts of cool, stale air.

I hurried away and down.

Snapshot, then, of a late-afternoon crowd, Second and Washington:

Multicolored hats, scarves, capelets, bags; grays and browns of business knits; a black and white cab in a flurry of blue and red coupes. Hurrying. White steam rising from a grate. One man yelling for a taxi, another selling shoe-shines. A glint of light reflecting off a silvery figure, stopped before a child in a red coat and red cap. Her hand touching the coattail of a blandly smiling parent.

Her eyes looking up at the robot.

Her tiny mouth opening, in a laugh.


Long ago, when she and I were almost friends, we used to meet in Kook's Korner, a bar halfway between her suburb and mine. Alexi's gruffness felt less aggressive, outside the office. I ended up marrying Jenny, though.

"How long since you've gotten stinking drunk?" she said when I met her now.

Kook's Korner looked about the same. Lots of colorful nonsense on the walls: posters, hand-scrawled rantings, rock show announcements. Highball glasses clunked on the old wooden bar.

When I looked at Alexi, I thought she looked much the same, too. Not the same as in the office. The same as she had half a decade before.

"I never get stinking drunk."

"Stinking liar."

She ordered something I had not tasted in years.

"Is this an official meeting?" I said, feeling the warmth.

"Official, hell. Off the books."

"Away from our own big staring eyes."

"Who the hell knows that, anymore? Who the hell knows what's looking at us, even in a bar?"

"Not in Kook's."

"Couldn't be," she agreed. "You have an on-air discussion with Wilkening, tomorrow morning."

"I didn't tell you that."

"Yeah, well, the Truth Institute's administering your payroll and it hears things. Big surprise. Listen, I'm going to tell you what to say, to Wilkening."

"Go to hell."

"That's already arranged. I'm just sealing my doom, telling you what to say. Because this is just me talking, this time. Just me, Alexi, drinking and chatting in a bar."

Since that made me want to, I listened. The henchman awaiting his job got it, right then. The victim was her, she said. Or else me.

Or else, just maybe, it was the Truth Institute.



Being accosted by a Diogenes Robot, Alexi said, she saw the light.

You're crazy, I said back to her.

We're the ones building the robots, Alexi said. Put systems created by the Truth Institute in every building, and everyone has to start complying, perfectly, with the protocols of each building. It's not human-machine interface. It's just more of Alexander the Great, standing over Diogenes of Sinope in his bathtub. "Get out of the light." That was all Diogenes had to say, to Alexander the Great. Get out of the light, Diogenes said. You're standing in the way of the sun. You're blocking the light. Literally, the light.

So what do you want me to say to Wilkening? I said to Alexi. Not that I'm going to say what you want me to.

Sure you are, she said, because you're going to think about it and you're going to see it's true. You're going to tell him to stop pussyfooting around and really do it right. To do his Diogenes Robots right. Strip out those millions of eyes. Tell him that. Strip out everything but the sound box. Let it walk around, looking for honesty. But let it just be the tin-can man, all hollow, holding up the dark lantern in the middle of the day. Just the tin-can man. Because if he doesn't strip out the watching eyes, then he's part of the problem. If the Diogenes Robot has a million eyes, it's an Alexander Robot, not Diogenes. It's the big empire, not the one man.

If we can't be honest when no one is watching, then Diogenes is wrong, she said. And if Diogenes is wrong, then we do need Alexander, after all. Alexander, the Great, standing in our sunlight.

I had been about to open my mouth to argue until she said that.

You're saying we at the Truth Institute are like Alexander, I said.

That's what I'm saying. And I think that's what's gotten you so upset. You were good, really good, at selling the idea of the Truth Machine to those overseas companies. But it's put you in a missionary frame of mind. And you're still there. You think you're right and no one else is.

Don't you believe in what we're doing?

I'm just doing my job, she said. As to whether I believe in what I'm doing . . . it's more like I don't want to believe. It's like I'm afraid what we're doing really, actually might be the right thing. What if the Truth Machine turns into more than a gizmo we sell to businesses? What if this sort of thing creeps into public places—and then private places? If you accept the idea that people need the objective standard-keeping system we're offering in the workplace, then how can you say people don't need that same system in every public place—and then in the home?

We both stared at our drinks. Alcohol—that was enough, once, to get people to open up to each other. To let things out. To be honest, friend to friend.

You're worked up because you see Wilkening as a competitor, she said. And maybe he is a competitor. Because he's introduced a different kind of watcher than the one we're introducing. And what the Truth Institute really needs, instead of more of the same thing, is exactly the opposite thing. It needs the real Diogenes.

You're crazy, I said.

Maybe I am, she said. But I think we need the real Diogenes. Because I hope we're wrong. I hope the Truth Institute is wrong. Dead wrong. I hope society doesn't need this. Sure, I want us to sell a few systems and make a living for a while. But deep down I hope people can find honesty on their own, without the Truth Machine. Otherwise . . .

She let the word hang there:

Otherwise.

Otherwise, I filled in for her, we really do need Alexander the Great looming over us, standing in the light. We need Big Society to set the rules and then enforce them on the smallest level. Everywhere. All the time.

Exactly as I had been promoting, overseas.

Promoting. Or preaching, maybe.

So, I said to her, you want there to be real Diogenes Robots, not watchers? Real Diogenes Robots—because you hope we're wrong?

Don't admit the doubt, she said to me. Stand true for the Institute. But challenge him. Challenge him to do it right, and not make any more fake Diogenes. Make them the real thing. Do your machine-interface shtick, if you want, Justin. But challenge the guy to do Diogenes right.

I turned it over in my mind, riding home at midnight.

I emerged from the station near my apartment. With the roaring rails behind me, I reeled beneath the million lights of the city above staring down. I jammed my hands into my pockets, breathing in deeply and looking up.

How could I do it? I, Justin Abling, who had been selling the fear of dishonesty halfway around the world . . .

How could I be the one to step forward to the man who had set these truth-seeking robots upon the world—

Step forward, to hold the darkened lantern to his face?

I saw a cement-encased public trash can, there beside the street. My fingers closed around a wad of paper that was jumbled in with the keys in my pocket. The street, for the moment, had no cars, no pedestrians. No Diogenes Robots, looking for an honest man.

"I'm looking for one, too. Hope I'll find me, someday," I said, tossing the wad of paper in the trash can and heading home.

 


