

Rating the major computing periodicals on readability

Published: 01 February 1985

Abstract

The readability of the ten major computing periodicals is analyzed using the Flesch Reading Ease Index.

References

[1]
Bruce, B., Rubin, A., and Starr, K. Why readability formulas fail. IEEE Trans. Prof. Commun. (Mar. 1981), 50-52.
[2]
Coke, E., and Rothkopf, E. Note on a simple algorithm for a computer-produced reading ease score. J. Appl. Psychol. (June 1970), 208-210.
[3]
Dixon, W., and Brown, M., Eds. BMDP-79 Biomedical Computer Programs P-Series. Univ. of California Press, Berkeley, 1979.
[4]
Fang, I. By computer: Flesch's reading ease score and syllable counter. Behav. Sci. (May 1968), 249-251.
[5]
Greenberg, H., Ed. The Standard Periodical Directory. 6th ed. Oxbridge Communications, New York, 1979-1980.
[6]
Flesch, R. A new readability yardstick. J. Appl. Psychol. (June 1948), 221-233.
[7]
Flesch, R. How to Test Readability. Harper, New York, 1951.
[8]
General Motors Corporation. S.T.A.R. brochure. General Publicity, Detroit, Mich.
[9]
Kirk, R. Experimental Design: Procedures for the Behavioral Sciences. Brooks/Cole Publishing Co., Monterey, Calif., 1968.
[10]
Klare, G. Assessing readability. Reading Res. Q. 1 (1974-1975), 62-102.
[11]
Lemos, R. STAR plus transcription rules: Reliable readability index. J. Educ. Technol. Syst. 2, 3 (1982-1983), 251-258.
[12]
Lemos, R. CDC CYBER BASIC version of STAR. Computer program available from the author, 1980.
[13]
Levene, H. Robust tests for equality of variances. In Contributions to Probability and Statistics, I. Olkin, Ed. Stanford Univ. Press, Palo Alto, Calif., 1960, 278-292.
[14]
Nie, N. SPSS: Statistical Package for the Social Sciences. 2nd ed. McGraw-Hill, New York, 1975.
[15]
Redish, J. Understanding the limitations of readability formulas. IEEE Trans. Prof. Commun. (Mar. 1981), 46-48.
[16]
Ulrich's International Periodicals Directory. 18th ed. R.R. Bowker Co., New York, 1979-1980.


Reviews

Eric A. Weiss

AUTHOR REBUTTAL

I am disappointed that Mr. Weiss has missed the entire point of my recent paper on the readability of computing periodicals. While his first paragraph provides a good summary of the paper, he then raises two completely irrelevant questions. The main question is, "Are computing journals readable?" Many computing professionals (including myself) feel they are not. This paper gives empirical evidence on readability levels. The next question is, "What factors affect readability?" There is much research (cited in the article) to show that word and sentence length are valid predictors of readability levels. For example, since grammar school we have all been instructed not to use "run-on" sentences. Scholarly journals are real abusers of this basic principle of clear communication. Also, excessive use of "big words" may look impressive, but this practice does not contribute to conveying information. Would (should) we use such words and sentences in conversations?

Mr. Weiss criticizes Flesch's formula. Even a cursory review of the literature on readability will show Flesch to be the major figure in this area. Furthermore, other approaches would have yielded substantially the same results. Computing periodicals are not very readable. By the way, Mr. Weiss makes an erroneous statement in his third paragraph. It is an index score of 100 (not zero) that represents maximum reading ease. Also, Mr. Weiss' comments on other considerations determining readability are fully recognized in the article with reference citations to detailed critiques of readability indices. Readability indices indicate potential barriers to effective communication of ideas. I would ask that the reader review Figure 1b in the article, which shows an example passage with a low readability score. Try reading it out loud to yourself or someone else. This is not a readable passage.

Mr. Weiss uses terms such as "presumptuous," "ridiculous," "dangerous," "wrong," and "foolish" to refer to the approach taken in this paper to measure readability. What approach would Mr. Weiss suggest? What specific guidelines can be given to writers, editors, and publishers? Criticisms without specific alternatives have little value.

In spite of the fears of Mr. Weiss, I believe there is a growing concern about readability. Requests for the BASIC program that I used in the study came from a very surprising cross-section of readers. In addition to US colleges and universities, I received many requests from foreign institutions. Other requests came from government (including military) organizations. Requests also came from private organizations and individuals. As a side note, the paper received coverage in Business Week, Chronicle of Higher Education, New York Times, and even USA Today.

In closing, I would like to "challenge" Mr. Weiss to analyze a current or future writing project using a readability index. I believe that a very useful perspective will be gained on the difficult task of writing papers with a high degree of readability.

— Ronald S. Lemos, Carson, CA

REVIEWER RESPONSE

I am delighted that Mr. Lemos found my review readable and understandable in spite of my deliberate use of artificially long sentences and multisyllabic words, and grateful for this opportunity to say again what he understood at once: that I think the Flesch readability index is a bunch of dangerous nonsense.

I share Lemos's concern for the readability of computing periodicals, and I truly understood that this concern probably stimulated his article and the editors' acceptance of it. I did not cite this as the author's main point because he did not make the point clear and explicit. Instead, he said, perhaps with academic irony, "Readers are left to draw their own conclusions as to the appropriateness of the apparent degree of difficulty in readability evident in the magazines analyzed." (There is a sentence that Flesch would deplore!) I took as the entire point of the article the one he identified in his second paragraph, saying, "In this article we are concerned with measuring readability," and I undertook to criticize his measuring tool and the meaning that might be attached to measurements made with it.

I agree with Flesch and Lemos that long words and long sentences make for hard reading, but I do not agree that short sentences and short words are all that are needed for easy and interesting reading, the common meaning of the word "readable." I object to Flesch's arbitrary assignment of the word "readable" and the phrase "reading ease" to a measure that includes only these characteristics. Indeed, I deplore any purely mechanical attempt to measure with a single number such a multidimensional and, indeed, artistic characteristic as "readability" in its common meaning. I consider such efforts to be dangerous, for their use and acceptance will lead technical writers away from the effort to achieve clarity with unity, coherence, and emphasis and will substitute the false rule of "only short words and short sentences." Lemos tells us that he has had many requests for the BASIC program that he used, so we can confidently expect this ridiculous definition of "readability" to spread.

To emphasize the nonsensical nature of the Flesch Reading Ease Index, I have taken up Lemos's challenge by rewriting my response with short sentences and words to achieve an RE of 117.943:

Hey. Right. Ahhh. Thanks, Ron. You know, I am glad you wrote. I am glad you came on so strong. You read my piece. You caught my drift. You got my point. You heard me. Right. Yeah. Good. You got it. The Flesch thing is dumb. The Flesch thing is bad. No one with his head on right should use it. I wish we could stop it. Thank you, Ron, for the chance to say it once more. Good. You and I, Ron, we feel the same way about big words and long strings of words. We hate them. Right. They are hard to read. Right. Yeah. But that is not all. Short words and short strings of words can be hard to read and real dull too. Yeah. You know, there is more to easy to read than just short. Lots more. Right. Yeah. Stop Flesch while there is time! Hey!

Eric A. Weiss

The author has calculated the Flesch Reading Ease Index for ten computing periodicals. He reports that the Communications of the ACM is the most readable (classed as difficult, academic, and requiring two years of college to understand), while the IBM Systems Journal is the least readable (classed as very difficult, scientific, and requiring more than four years of college to understand). Three feature articles were systematically selected from four issues of each publication in the period from July 1979 to September 1980. In accordance with Flesch's guidelines, the analysis was based on five 100-word samples selected from each article. The periodicals measured were, in order from most to least readable, Communications of the ACM, Computer Journal, Computer Decisions, Mini-Micro Systems, Datamation, Infosystems, Data Management, Computerworld, Computer, and IBM Systems Journal.

The paper raises two questions: what is the Flesch Reading Ease Index, and what does it mean to computing professionals who read and write? The index is described and promoted in Flesch's book [1]. Lemos's paper gives references to IEEE Transactions which have published recent discussions of the limitations of readability indexes [2, 3], but the basic nature of the index can be grasped simply by looking at its defining equation:

READING EASE = 206.835 - 0.846 WORD LENGTH - 1.015 SENTENCE LENGTH

A minor, but telling, point is the ridiculous number of digits in the factors, which suggests an effort by Flesch to give the index credibility by making it appear precise. But the major point is that the only things considered are WORD LENGTH (number of syllables per 100 words) and SENTENCE LENGTH (average number of words per sentence). For example, one way of getting an index value of zero, which represents maximum reading ease, would be by using an average of 2.35 syllables per word and an average of 7.9 words per sentence. That is, material which uses short words and sentences will be classed as easy to read, while that which uses long words and sentences will be classed as difficult. No consideration is given to structure, style, organization, accuracy, correctness, ease of understanding, unity, coherence, or emphasis.

Thus, the Flesch Reading Ease Index is an extremely limited measure, at best only indicating in a very crude way whether or not the words and sentences in the sample are, on the average, short. Certainly the use of short and simple words and sentences should be encouraged, especially considering the tendency of most technical writers to go to the opposite extreme; but it seems presumptuous, as critics of this and other indexes have pointed out, to call this a measure of reading ease or readability or clarity, ridiculous to calculate such a crude index to four significant figures, dangerous to encourage writers to use short-words-and-sentences uncritically, wrong to compare writings and publications on this basis, and foolish to pay much attention to such comparisons. (That was a 93-word sentence.)

Thus, the article is marginally interesting, reporting that, to a first approximation, all ten of these computing periodicals use words and sentences of about the same length, and, to a less justified second approximation, suggesting that if Flesch's claim that the length of words and sentences that a reader can handle is related to his or her education, these periodicals will only be easily readable by people who have been to college. (That was a 72-word sentence.)
Unfortunately, this is probably not the end of the matter. We can confidently expect that word processing programs will soon include readability index calculations. It will then be necessary for the proponents of good, clear, understandable, communicative writing to unify, cohere, and emphasize the dangerous meaninglessness of nonsense indexes.
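
For readers who want to experiment with the index being debated, the sketch below illustrates the calculation Weiss quotes. It is a minimal Python illustration, not Lemos's BASIC program and not Flesch's manual procedure: the formula is the one given in the review, but the vowel-group syllable counter is a crude stand-in heuristic, so its scores only approximate what a careful hand count would give.

# Minimal sketch of the Flesch Reading Ease calculation discussed in the review.
# The formula is the one Weiss quotes; the vowel-group syllable counter is a
# rough, hypothetical heuristic, not the counting method used in Lemos's
# BASIC program or in Flesch's manual rules.

import re


def count_syllables(word):
    """Crude syllable estimate: count runs of consecutive vowels (y included)."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))


def reading_ease(text):
    """RE = 206.835 - 0.846 * (syllables per 100 words) - 1.015 * (words per sentence)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not words or not sentences:
        raise ValueError("text must contain at least one word and one sentence")
    syllables_per_100 = 100.0 * sum(count_syllables(w) for w in words) / len(words)
    words_per_sentence = len(words) / len(sentences)
    return 206.835 - 0.846 * syllables_per_100 - 1.015 * words_per_sentence


if __name__ == "__main__":
    # Weiss's zero-score combination: 2.35 syllables per word (235 per 100 words)
    # and 7.9 words per sentence give a score of roughly zero.
    print(206.835 - 0.846 * 235 - 1.015 * 7.9)   # ~0.01
    # Very short words and sentences push the score toward, and past, 100.
    print(reading_ease("The Flesch thing is dumb. The Flesch thing is bad."))

Note that the vowel-run heuristic miscounts words with silent or adjacent separately pronounced vowels, which is presumably why computer-produced syllable counting is treated as its own problem in references [2] and [4] of the article.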

Published In

Communications of the ACM, Volume 28, Issue 2 (February 1985), 78 pages.
ISSN: 0001-0782; EISSN: 1557-7317. DOI: 10.1145/2786.

Publisher

Association for Computing Machinery, New York, NY, United States


Cited By

  • (2004) How effectively do marketing journals transfer useful learning from scholars to practitioners? Marketing Intelligence & Planning 22, 5 (540-556). DOI: 10.1108/02634500410551923. Online publication date: Aug-2004.
  • (1989) The cloze procedure: assessing the understandability of an IEEE standard. IEEE Transactions on Professional Communication 32, 1 (41-47). DOI: 10.1109/47.21861. Online publication date: Mar-1989.
  • (1989) The cloze procedure: an assessment of the understandability of data processing texts. Information and Management 17, 3 (143-155). DOI: 10.1016/0378-7206(89)90016-5. Online publication date: 1-Oct-1989.
  • (1989) Stalking the Typical Undergraduate Software Engineering Course: Results from a Survey. Issues in Software Engineering Education (168-195). DOI: 10.1007/978-1-4613-9614-7_11. Online publication date: 1989.
  • (1987) Profile of undergraduate software engineering courses: results from a survey. Proceedings of the eighteenth SIGCSE technical symposium on Computer science education (523-528). DOI: 10.1145/31820.31817. Online publication date: 1-Feb-1987.
  • (1987) Profile of undergraduate software engineering courses: results from a survey. ACM SIGCSE Bulletin 19, 1 (523-528). DOI: 10.1145/31726.31817. Online publication date: 1-Feb-1987.
  • (1987) Predicting readability of data processing written materials. ACM SIGMIS Database: the DATABASE for Advances in Information Systems 18, 4 (40-47). DOI: 10.1145/1017816.1017820. Online publication date: 1-Jul-1987.
