Most of us have probably heard that a high-ranking official with the Department of Homeland Security was recently arrested for soliciting sex from a 14-year-old.
Shock. Rage. Depression.
Turns out the youngster was a cyber-impostor; in reality, the official was fooled by a cop (good thing it wasn’t a terrorist, huh?).
Over at The Smoking Gun comes a tale of similar disgustingness going on at the highest levels of NASA. I happened to be drinking coffee from a NASA mug when I read THE STORY, so it was of more than passing interest. According to the article:
"On Wednesday morning, federal investigators seized a laptop computer, a hard drive, CDs, and other material from the office of James R. Robinson, who was present when agents with NASA’s inspector general executed a search warrant at his E Street office. According to an affidavit filed in U.S. District Court, Robinson, 42, used his office computer (and another in his Virginia home) to trade and examine illegal images and videos."
So, I thought, the proclivity to want to look at nasty pictures of children doesn’t discriminate, but can be found at all levels of society, in all occupations, and in people of great or little intelligence. Once again, the perp was caught by a cyber cop (or cops) posing as a youngster. As has been said before, sin makes people stupid.
But what really caught my eye was this bit:
"In December, after being contacted by postal agents, NASA’s inspector general opened its own probe of Robinson, which included a review of reports from the space agency’s "web activity monitoring application." The NASA system, dubbed Web ContExt, is apparently a state-of-the-art application that used a "skin tone filtering system" to determine that Robinson was viewing child porn from his office computer, most recently in January, according to the affidavit."
So, NASA has some new, ultra-kewl technology that somehow scans the content of web images and indicates how much of the total is made up of "skin tones". I assume that over a certain threshold, some sort of red flag would pop up.
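For the curious, here is a bare-bones sketch of how that kind of scan might work, assuming nothing more than a crude color test and a made-up 40% threshold (the article gives no details about how Web ContExt actually does it):

```python
# Hypothetical sketch of a "skin tone filtering" pass -- NOT NASA's actual
# Web ContExt algorithm. Count pixels whose colors fall in a rough
# skin-tone range, then raise a flag if the fraction passes a threshold.
from PIL import Image  # pip install Pillow


def skin_tone_fraction(path: str) -> float:
    """Return the fraction of an image's pixels that look roughly skin-colored."""
    img = Image.open(path).convert("RGB")
    pixels = list(img.getdata())

    def looks_like_skin(rgb) -> bool:
        r, g, b = rgb
        # Crude, illustrative heuristic: reddish, reasonably bright, R > G > B.
        return r > 95 and g > 40 and b > 20 and r > g > b and (r - b) > 15

    return sum(looks_like_skin(p) for p in pixels) / len(pixels)


def red_flag(path: str, threshold: float = 0.40) -> bool:
    """True if the image crosses the (made-up) skin-tone threshold."""
    return skin_tone_fraction(path) >= threshold
```

A real system is presumably far more sophisticated than that, but the basic "percentage of flesh-toned pixels" idea really is about that simple.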
I don’t know how widespread this technology is, but it wouldn’t be too surprising to find out that some of the larger corporations were using it. If they aren’t, they soon will be.
This brings up the old debate about public good vs. invasion of privacy. On the one hand, you might catch a bunch of child-victimizing cyber-pervs, and on the other, you might have blackmail, extortion and the errant prosecution of innocent people.
One way or the other we will have to come to terms with this kind of technology.
+J.M.J+
Wow, if they’ve finally invented software that can accurately discern a pornographic image like that, it would be a big boon for filtering software manufacturers! Though I wonder how long it would take the pornographers to find a way around it. Also, would it be as effective on all “races,” or just for Caucasian skin tones? Still, it would be a great advance in filtering technology.
In Jesu et Maria,
I recently read that MySpace.com is now using similar technology in order to filter out porn.
Yes, it will be very interesting to see both the benefits and the problems associated with this technology.
I’m not sure if attraction to adolescents is considered to be sinful by the Church. As far as I know, there have been a fair number of Catholic marriages in which the groom was over 18 and the bride was under 18, likely in the Middle Ages, and there may be some of those going on now in Latin America. Mexico may have different laws that vary from state to state, so it would be interesting to see if those marriages, assuming they have taken place, are considered to be valid Catholic marriages. Obviously, this whole internet seduction thing is completely different and illegal, but still, it kind of makes me wonder what the details of that issue are.
Ram Rod, no attraction is sinful, although it may or may not be a temptation. A sin is wrongful action in the face of temptation.
“A sin is wrongful action in the face of temptation.”
But if there are these under 18 marriages still going on, they would count as “actions”. And the question remains as to whether they are considered sinful/invalid/null or whatever.
The canonical age of consent for marriage under CIC 1083 is sixteen for a man and fourteen for a woman, though there are several mitigating factors, namely the right of the episcopal conference to raise the age of consent within its jurisdiction (1083.2); the requirements of local law for age of consent (1057); and the accepted practice of the region (1072). I may well be misreading 1057, and it may instead refer to the canonical age of consent defined later.
I think what Rod is getting at is that, still according to current Canon Law, the global minimum age for marriage is sixteen for “men” and fourteen for “women” (Can. 1083.1), although the local episcopal conference is free to set higher limits (Can. 1083.2). Generally, these local higher limits correspond to local civil laws.
But I think that Rod misses the point that this is about marriage, a lifelong, sacramental relationship between one man and one woman, which may, in certain cultural contexts, start at a younger age than in others.
The illegal, immoral abuse of young people (or even older people) by the cyberporn industry, from origin to “consumption,” is an entirely different discussion from social and ecclesial norms related to the minimum age for marriage.
“The illegal, immoral abuse of young people (or even older people) by the cyberporn industry, from origin to “consumption,” is an entirely different discussion from social and ecclesial norms related to the minimum age for marriage.”
Right. I was sort of changing the topic. It is still related, though, in the sense that the immoral abuse of young people is generally considered to be worse than the immoral abuse of older people, because the young are still children in many ways. These secular considerations may have some parallel in Catholic teaching about the general time in their lives when persons first become moral/conjugal/sexual agents in the same sense that adults are.
“This brings up the old debate about public good vs. invasion of privacy. On the one hand, you might catch a bunch of child-victimizing cyber-pervs, and on the other, you might have blackmail, extortion and the errant prosecution of innocent people.”
If you’re innocent, how can you be blackmailed? Why would I pay someone to keep quiet that I looked at images with lots of skin tones if there was nothing shameful in those images? It sounds like the technology is just a filter that sets a red flag, saying “look at this.” Besides, workplaces that monitor employee web use tell the employees they’re being monitored, so there’s no expectation of privacy.
Joy-
My point was that, as bad as porn-surfing might be, blackmail is just as bad (or worse). Even assuming guilt, there would need to be some kind of system to assure that this information was used properly.
Would you think less of a man who viewed porn at his workstation, or of one who blackmailed co-workers or underlings with that information? I know which man I’d rather bump into in a dark alley.
Porn viewing is one thing (it’s shocking and sad how common it is), and child porn another. To exist at all, child porn requires victimization. It is, by definition, visual rape, as opposed to visual fornication or adultery.
I agree that there should be no expectation of privacy on a company computer. I just don’t think that in battling one social ill (porn) we should enable another (blackmail, etc…).
A) The real problem with child porn has less to do with looking at naked young people (there are some mature-looking 14-year-olds and some young-looking 25-year-olds out there, so you can’t tell) than with the fact that most child porn that someone gets caught with is usually violent. Like the whole problem of foster parents becoming foster parents so they can rape and take pictures of the children. Child porn is rarely simply pictures of young girls. Even if it is, it very usually degrades into violent porn.
B) The government has constantly been pushing (since Clinton especially) to put more and more of the responsibility for these kinds of things on businesses. Nearly every business monitors its users’ e-mail and web traffic at least a little bit, if not keeping serious logs. And ISPs are supposed to keep track of certain things that allow the FBI to show up and say, “Who was logged in at this time from this IP address?”, etc…
“If you’re innocent, how can you be blackmailed?”
Fake or misleading data, for one.
Tim J says:
“Porn viewing is one thing (it’s shocking and sad how common it is), and child porn another. To exist at all, child porn requires victimization. It is, by definition, visual rape, as opposed to visual fornication or adultery.”
You’re too easy on legal pornography. I wouldn’t be too surprised if much of the non-child porn business relies upon human trafficking and the enslavement of vulnerable women. And let’s not forget that many people move on to kiddie porn because they’ve become numbed to even the titillation provided by “regular” porn.
+J.M.J+
>>>”If you’re innocent, how can you be blackmailed?”
I suppose it might be possible for one employee to view porn on a computer that another employee is logged onto, if the other employee is away from his desk, etc. If the true perpetrator is not caught, the innocent employee might be blamed. I haven’t been in the work force for eight years, though, so maybe I don’t know how that kind of thing works nowadays….
In Jesu et Maria,
“You’re too easy on legal pornography.”
Granted, I was presenting a “best case” scenario. It’s immoral in any case, but in at least some cases it can be truthfully said that neither the viewer nor the viewee is more a “victim” than the other. It can be simply a case of two people trading off the spiritual diseases of one another.
Child porn is qualitatively different (regardless of the presence or lack of explicit violence). The child depicted in a pornographic way can’t be anything but a victim of wicked adults. One cannot view child porn without participating in this process.
I will say that in all likelihood, even viewing “regular” porn encourages and supports the child porn industry, in the end. It is all of a piece.
I don’t doubt that a lot of the women involved in porn are doing so against their will.
Ok, so I like to make web pages using tan, peach and yellow backgrounds. Or heck, maybe they’re “innocent” pictures of… whatever the heck. Could put me on a list to be watched by my employer, if they see that 80% of my time at work is spent with these “flesh tones.” But really, if the software is that sophisticated, it should be sophisticated enough to cache the web addresses in question. So if I do send up little red flags, they can look to see if there is real concern, or if the program is sending up false “positives.” I mean, in that sense, there are things that can resolve possible invasion/persecution problems.
NOBODY likes to feel like their employer is watching over their shoulders, literally or electronically. It can create a lot of tension in the workplace. Trust me, I’ve worked there. Employers also have a right to demand a fair day’s work from their employees. Spending 4.5 hrs a day on fantasy baseball or lookin at nekkid ladies is not really doing that, either. Sure, it’ll always be a struggle as to how much supervision is too much supervision, but it’s a legitimate reality.
Tammy, the way I believe the technology works is that pages with a high degree of flesh tones are flagged to be judged by human operators for pornographic content.
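If that’s right, the automated part is probably little more than a sorting step, something like this sketch (the Visit fields and the skin_fraction function here are made up for illustration, not any real product’s API):

```python
# Illustrative guess at the flag-then-review step: the computer only decides
# what lands in the queue; a human operator decides what it means.
from dataclasses import dataclass
from typing import Callable, Iterable, Iterator


@dataclass
class Visit:
    user: str
    url: str
    screenshot_path: str


def candidates_for_review(
    visits: Iterable[Visit],
    skin_fraction: Callable[[str], float],  # e.g. a skin-tone scorer like the one above
    threshold: float = 0.40,
) -> Iterator[Visit]:
    """Yield visits whose captured page crosses the skin-tone threshold.

    Nothing is blocked or reported automatically; flagged visits just go
    into a queue for a human operator to judge.
    """
    for visit in visits:
        if skin_fraction(visit.screenshot_path) >= threshold:
            yield visit
```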
The problem with filters is that computers cannot understand the content of the pictures — only that there are pictures on the site, how many, placement, etc. This is why, even if you use filtering programs or filtering ISPs (mine is Catholic), you eventually need a human intellect somewhere along the filtering process that updates “off-limits” sites on a regular basis.
Sure, you can block all sites that contain pictures and use dirty words, but you still have a problem with completely neutral words that, taken in a certain context, can lead to some pretty creepy stuff.
On the other hand, my filters sometimes make it really hard for me to access Catholic sites that talk about certain topics, and will even prevent me from downloading radio content from CA about, for example, same-sex attraction.
And not only are the sites you visit in a cache, but every email you write is potentially subject to scrutiny without your knowledge.
———————-
As far as this particular case is concerned, you have to be more on the lookout for this at higher echelons of government because blackmail is probably the most common way to get someone to spy for you.
Ideological affiliation does not really work these days because no one believes in anything anymore.
But it is scary how the personal weakness and proclivity of one man could bring our entire country down in flames…
So be good!
————–
As far as victimization is concerned, even consenting adult women are being victimized when they pose for these sites. In fact, every picture taken victimizes all women everywhere.
If I leave my desktop unlocked, then I’ve already committed a security violation. Someone looking at porn on my account at that point is the least of my worries.
I still don’t see how this technology could hurt anyone who was doing their work like they’re supposed to in the first place.
To me, it’s just another tool for security folks to use to prevent serious misuse of company (or government) property.
As far as privacy issues in general…I just don’t worry about it. If you’re so worried about privacy, then what do you have to hide?
Joy-
That’s true, you would have already violated security, but so what? Is that as severe a problem as what someone might do with your account? People could be fired for surfing for inappropriate material, but rarely would people be fired for forgetting security once.
The other issue that people should be concerned about is how you prove that you either didn’t do it, or that it was an accident/innocent mistake/innocent picture. Anything where you use automation to determine whether or not something is wrong leads to issues of this sort.
The best thing that could be done with this technology is to use it to mark something for the sysadmin to flag and then monitor by hand. The way it should be used is to mark when a person appears to be looking at something inappropriate, review the websites manually to make sure they aren’t innocent, then, while the person is looking at said images, go and physically check that the person at the keyboard is the one who would be accused and is supposed to be there, and ask for an explanation.
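In other words, something like this, where every helper name is just a hypothetical stand-in for a manual step, not any real monitoring product’s API:

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Flag:
    user: str
    urls: List[str]


def handle_flag(
    flag: Flag,
    looks_innocent: Callable[[List[str]], bool],   # manual review of the flagged sites
    person_at_desk_is: Callable[[str], bool],      # physical check at the workstation
    ask_for_explanation: Callable[[str], str],     # the human conversation
) -> str:
    """Outline of the flag-then-verify procedure described above.

    The software's only job is to raise the flag; every callable here
    stands in for a human step.
    """
    # 1. A person looks at the actual websites to rule out false positives.
    if looks_innocent(flag.urls):
        return "dismissed: innocent content"
    # 2. While the viewing is happening, confirm who is actually at the keyboard.
    if not person_at_desk_is(flag.user):
        return "escalate: the account may have been used by someone else"
    # 3. Only then ask the person for an explanation.
    return ask_for_explanation(flag.user)
```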
People, rightfully, worry that this is not the way in which it would be used, and that you could easily find yourself in a lot of hot water very quickly.
I know that this technology has been in use for at least a year, because I sent a picture of my then-6-month-old daughter in a cute little outfit to some of my friends from NY. One of them wrote back to me saying that they never got the email with the pictures, because their porn filters at work cut it out for having too high a % of flesh tones in it. (I was in a t-shirt, holding my little girl in a onesie, and the focus of the picture was her and my arm.)
So, yes, there are “false positives” and yes, this technology is out there and has been used for a while, and not just by NASA or other gov’t agencies.
“web pages using tan, peach and yellow backgrounds.”
It occurs to me that skin has natural variation in it, of shadows at least, since it’s uneven. So the software may be able to filter those flat-colored backgrounds out.
I installed that kind of software on my home computer, and thought I had finally made the Internet a safe place for my kids. Then I caught my son looking at PlayOrc.
/just kidding 😛