Words of Wisdom

“Pornography, the Deadly Carrier,” President Thomas S. Monson

Ensign, July 2001, 2-5.

 

As we encounter that evil carrier, the pornography beetle, let our battle standard and that of our communities be taken from that famous ensign of early America, “Don’t tread on me.” Let us join in the fervent declaration of Joshua: “Choose you this day whom ye will serve; … but as for me and my house, we will serve the Lord.” Let our hearts be pure. Let our lives be clean.

 

“A Tragic Evil Among Us,” President Gordon B. Hinckley

October 2004 General Conference

 

I repeat, we can do better than this. We must do better than this. We are men of the priesthood. This is a most sacred and marvelous gift, worth more than all the dross of the world. But it will be amen to the effectiveness of that priesthood for anyone who engages in the practice of seeking out pornographic material.

 

 

“Pornography,” Elder Dallin H. Oaks

April 2005 General Conference

Pornography impairs one’s ability to enjoy a normal emotional, romantic, and spiritual relationship with a person of the opposite sex. It erodes the moral barriers that stand against inappropriate, abnormal, or illegal behavior. As conscience is desensitized, patrons of pornography are led to act out what they have witnessed, regardless of its effects on their life and the lives of others.

“Blessed Are All the Pure in Heart,” Elder L. Whitney Clayton

October 2007 General Conference

 

Along with losing the Spirit, pornography users also lose perspective and proportion. Like King David, they try to conceal their sin, forgetting that nothing is hidden from the Lord (see 2 Nephi 27:27). Real consequences start to accumulate as self-respect ebbs away, sweet relationships sour, marriages wither, and innocent victims begin to pile up. Finding that what they have been viewing no longer satisfies, they experiment with more extreme images. They slowly grow addicted even if they don’t know it or they deny it, and like David’s, their behavior deteriorates as their moral standards disintegrate.

 

 

External & Internal Filters

Network-Level Filtering

The challenge of content filtering for BYU is complex. The University provides access to the Internet via wired and wireless connections to faculty, staff and students when they are on campus. The University also provides off-campus network connectivity for many employees who travel or have a work-related need for Internet access at home. Additionally, many employees have University-provided cell phones that include data plans which allow access to the Internet. The current strategy of network-level filtering provides protection for employees and students only when they are on campus and only when they are accessing the Internet through the University’s network. When employees or students leave campus or use third-party connections to the Internet while on campus, the University’s network-level filtering provides no protection at all. Moreover, the network-level filtering technology currently in place is a blunt instrument. Entire sites can be blocked, but a growing number of media-rich sites have mixed content, i.e., the good is commingled with the bad. In these cases, the good is blocked along with the bad, and access to otherwise beneficial content is lost.
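To make the “blunt instrument” point concrete, the following sketch (in Python, using a hypothetical blocklist and made-up site names, not the University’s actual configuration) models domain-level blocking: every page under a blocked host is denied, so beneficial pages are lost along with objectionable ones.

    # Illustrative sketch only: a simplified model of domain-level (network) filtering.
    # The blocklist and site names are hypothetical, not the University's actual configuration.
    from urllib.parse import urlparse

    BLOCKED_DOMAINS = {"mixed-content-example.com"}   # hypothetical blocklist entry

    def is_allowed(url: str) -> bool:
        """Return False if the URL's host falls under a blocked domain.

        Note the bluntness: every page under a blocked domain is denied,
        even pages whose content is entirely appropriate.
        """
        host = urlparse(url).hostname or ""
        return not any(host == d or host.endswith("." + d) for d in BLOCKED_DOMAINS)

    # Both the objectionable page and the beneficial one are blocked together:
    print(is_allowed("https://mixed-content-example.com/objectionable-page"))   # False
    print(is_allowed("https://mixed-content-example.com/educational-lecture"))  # False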

Device-Level Filtering

Device-level filtering is an additional layer of technology available to filter content beyond the network level. This software is aimed at filtering the kinds of content that can be viewed on a particular computer. There are several different providers of such software, and each has its strengths and weaknesses. Some of this software functions much like network-level filtering, blocking or “white-listing” entire sites. Other software is more sophisticated, filtering or blocking content selectively within a site or even on a page. Some filtering software even allows administrators to limit the amount of time or the time windows during which the Internet can be accessed. In almost every case, administrators have the ability to review users’ browsing histories. While many of these capabilities are now built into the latest versions of the Mac and Windows operating systems, software provided by third-party vendors is generally more robust, flexible and feature-rich.
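As a rough illustration of the capabilities described above, the sketch below is a toy Python model combining white-listing, a time window, and an administrator-reviewable browsing history. The sites, hours, and policy are hypothetical and do not represent any particular vendor’s product.

    # Illustrative sketch only: a toy model of device-level filtering features
    # (white-listing, time windows, and a reviewable browsing history).
    # Sites, hours, and policy are hypothetical; real products differ widely.
    from datetime import datetime

    class DeviceFilter:
        def __init__(self, whitelist, allowed_hours=(6, 22)):
            self.whitelist = set(whitelist)      # sites the administrator has approved
            self.allowed_hours = allowed_hours   # access window (start hour, end hour)
            self.history = []                    # browsing history for administrator review

        def request(self, host: str, when: datetime) -> bool:
            within_hours = self.allowed_hours[0] <= when.hour < self.allowed_hours[1]
            allowed = within_hours and host in self.whitelist
            self.history.append((when.isoformat(), host, allowed))
            return allowed

    flt = DeviceFilter(whitelist={"lds.org", "byuh.edu"})
    flt.request("byuh.edu", datetime(2011, 3, 1, 10, 0))            # True: approved site, daytime
    flt.request("byuh.edu", datetime(2011, 3, 1, 23, 30))           # False: outside the time window
    flt.request("unlisted-site.example", datetime(2011, 3, 1, 10, 0))  # False: not white-listed
    print(flt.history)  # every request is recorded for the administrator to review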

Much like network-level filtering, device-level filtering is not without its drawbacks. Each device-level filtering application is based on a particular set of technologies and procedures to identify and filter content. Some are technology-based, relying on key words, computer image scans, content ratings, etc. Others are human-based, relying on the individual judgment of filtering service providers to decide whether content is appropriate. Invariably, these tools over-block content that most people would deem appropriate. Tools that allow administrators to create “white lists” of approved sites result in the most over-blocking. While white-listing might be an effective approach in some cases, it is too restrictive for most purposes (e.g., conducting research using Google to search across millions of web sites). Most filtering tools provide a mechanism for overriding blocks or for requesting a re-categorization of blocked material. However, this can become a burden for an administrator who is required to review overrides and override requests. Such inconveniences notwithstanding, the added sense of accountability this approach affords might be worth the investment.
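The override workflow might look something like the following sketch (again hypothetical; real products implement this differently): a user asks that a blocked or miscategorized site be reviewed, and the administrator either approves it onto the white list or denies the request.

    # Illustrative sketch only: a minimal model of an override-request workflow.
    # Site names, users, and the review policy are hypothetical.
    class OverrideQueue:
        def __init__(self):
            self.pending = []     # (user, site, reason) awaiting administrator review
            self.approved = set() # sites the administrator has added to the white list

        def request_override(self, user: str, site: str, reason: str) -> None:
            """A user asks that a blocked or miscategorized site be reviewed."""
            self.pending.append((user, site, reason))

        def review(self, site: str, approve: bool) -> None:
            """The administrator approves or denies; approval white-lists the site."""
            self.pending = [r for r in self.pending if r[1] != site]
            if approve:
                self.approved.add(site)

    queue = OverrideQueue()
    queue.request_override("student01", "research-journal.example.org",
                           "needed for a class research assignment")
    queue.review("research-journal.example.org", approve=True)  # the accountability step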

While device-level filtering, depending on how it is configured, can be too restrictive, there is a more serious challenge to consider. Even more troubling than over-blocking is the problem of under-blocking. With the exception of white-listing tools (which allow access only to specifically approved sites and nothing else), no filtering technique can consistently and effectively block all inappropriate material. The content sources and technologies on the World Wide Web are too complex, varied, and ever-changing. No content filter is infallible. Invariably, some content will be under-blocked. Even with device-level filtering in place, users run the risk of being exposed to inappropriate web sites and web content.
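A simple example shows why under-blocking is unavoidable for any rule-based filter. The sketch below (a hypothetical keyword filter, not any real product’s technique) catches an exact match but misses an obfuscated spelling and cannot see image content at all.

    # Illustrative sketch only: why keyword-based filtering inevitably under-blocks.
    # The keyword list and sample text are hypothetical.
    BLOCKED_KEYWORDS = {"objectionable-term"}

    def passes_filter(page_text: str) -> bool:
        """Block a page only if it contains an exact blocked keyword."""
        text = page_text.lower()
        return not any(kw in text for kw in BLOCKED_KEYWORDS)

    print(passes_filter("page containing objectionable-term"))    # False: caught
    print(passes_filter("page containing 0bjecti0nable-term"))    # True: obfuscated spelling slips through
    print(passes_filter("inappropriate image with no matching text"))  # True: images evade text filters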

Internal Filters

While external barriers and filters are essential tools in the fight against evil, it is important to recognize the limitations of such external protections. It is helpful to think of such tools as guardrails along the edge of a high mountain road. Guardrails are put in place to keep drivers safe. But drivers have to do their part: they have to obey the speed limit, stay in their lanes, etc. If they do not, no number of guardrails will be sufficient to prevent them from going over the edge. Certainly one who is intent on going over will find a way to do so, guardrail or not. And guardrails cannot be erected everywhere; there are bound to be gaps in the protection they provide. So it is with the external media filters we put in place. We can create a relatively safe environment in our apartments, dorm rooms, homes or offices. But the filters we put in place in those environments will not be present everywhere we go. Internet connections are unlikely to be filtered in all of our friends’ homes, in the hotel rooms we stay in, at public libraries, at Internet cafes, and in other locations.

While we should implement the most effective external filters we can identify, we are likely to find it increasingly difficult to implement and maintain fail-safe external protections to address what is essentially an internal moral challenge. For example, as technology continues to evolve, modestly priced, pervasive access to the Internet from cell phones and other mobile devices is becoming commonplace. The providers of these inexpensive, high-performance services do not always share our views of what is appropriate and what is not. And even the best Internet filters are not foolproof. There are bound to be gaps in the “guardrails” we put in place.

One of the most significant challenges we face comes when good content is mixed with inappropriate content on the same website. Sites like YouTube, Flickr, Picasa, and Google Video allow users to submit their own content. Much of the content on these sites is "virtuous, lovely, or of good report or praiseworthy." But much of it is not. While some device-level filtering software is designed to selectively allow or block content on such sites, we are primarily on our own to decide what we will view and what we will not. Elder David A. Bednar has reminded us that we have within us the most effective, accurate, fail-safe filter in the world: the Gift of the Holy Ghost. He offers two specific questions we can ask ourselves as we decide what is appropriate and what is not:

  1. Does the use of various technologies and media invite or impede the constant companionship of the Holy Ghost in your life?
  2. Does the time you spend using various technologies and media enlarge or restrict your capacity to live, to love, and to serve in meaningful ways?

As we consistently ask ourselves these questions, Elder Bednar assures us that we "will receive answers, inspiration, and instruction from the Holy Ghost suited to [our] individual circumstances and needs."