Assessment Details

4,209,183 All Time Assessments

Learning Styles

The learning styles scale embedded into SmarterMeasure is an original, proprietary assessment based on the multiple intelligences approach to identifying a person's dominant learning style(s).

Individual Attributes

The scale of SmarterMeasure which measures individual attributes is an original, proprietary assessment based on the dissertation research of Dr. Julia Hartman.

In her dissertation, she identified the individual attributes that are significant predictors of success in an online learning environment: variables such as motivation, procrastination, time availability, and willingness to seek help. The individual attributes section of SmarterMeasure measures these variables as indicators of success in an online course environment.

Life Factors

The Life Factors scale in SmarterMeasure is an original, proprietary assessment designed on the basis of formal and informal feedback from faculty and administrators at several schools that use SmarterMeasure.

On-Screen Reading Rate and Recall

The on-screen reading rate and recall scale of SmarterMeasure is an original, proprietary assessment developed by an expert panel of educators representing institutions that are clients of SmarterServices, in cooperation with LiteracyWorks.org, a project of the National Institute for Literacy.

SmarterMeasure measures both reading rate and recall because students should realize that they cannot simply skim on-screen course content: they may be assessed on that content in their courses.

SmarterMeasure is used by secondary schools, technical colleges, community colleges, universities and corporations. To best fit the needs of the learners of each of these organizations, several reading passages are available. Institutions using SmarterMeasure may select per login group the reading passage that is most developmentally appropriate for that group of learners. The following passages are available:

Grade Level | Topic                | Flesch-Kincaid Grade Level | Flesch Reading Ease | Number of Words
8           | Pencils              | 8.3                        | 59                  | 471
9           | Cell Phones          | 9.8                        | 52.4                | 510
10          | Neil Armstrong       | 10.1                       | 53.8                | 635
11          | Information Literacy | 11.5                       | 38.7                | 414
12          | Batteries            | 12.9                       | 42.8                | 652
13          | Contact Lenses       | 13                         | 40                  | 720

The Flesch/Flesch-Kincaid readability tests are designed to indicate comprehension difficulty when reading a passage of contemporary academic English. The two tests are the Flesch-Kincaid Grade Level and the Flesch Reading Ease. Although both use the same core measures (word length and sentence length), they apply different weighting factors, so the results of the two tests correlate imperfectly: a text with a comparatively high score on the Reading Ease test may have a lower score on the Grade Level test. The Reading Ease test was devised by Rudolf Flesch; the Grade Level formula was later developed by J. Peter Kincaid and colleagues.

The Flesch-Kincaid Grade Level formula presents readability as a U.S. grade level, making it easier for teachers, parents, librarians, and others to judge the readability of various books and texts. The score can also be read as the number of years of education generally required to understand the text. For example, a score of 8.2 indicates that the text is expected to be understandable by an average 8th-grade student (usually aged 13-14 in the U.S.).

In the Flesch Reading Ease test, higher scores indicate material that is easier to read, and lower scores mark passages that are more difficult. For comparison, the Reading Ease score of Reader's Digest is about 65, Time magazine is about 52, and the Harvard Law Review is in the low 30s.
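The two readability measures discussed above can be sketched directly from their published formulas. The word, sentence, and syllable counts passed in below are illustrative values, not counts taken from the SmarterMeasure passages.

```python
def flesch_reading_ease(words: int, sentences: int, syllables: int) -> float:
    """Flesch Reading Ease: higher scores indicate easier text (0-100 range)."""
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def flesch_kincaid_grade(words: int, sentences: int, syllables: int) -> float:
    """Flesch-Kincaid Grade Level: the result maps to a U.S. school grade."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

# Illustrative passage: 100 words, 5 sentences, 130 syllables.
ease = flesch_reading_ease(100, 5, 130)
grade = flesch_kincaid_grade(100, 5, 130)
```

Note how the two formulas weight the same inputs differently: longer sentences and longer words lower the Reading Ease score but raise the Grade Level score, which is why the two scales move in opposite directions for the same text.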

The degree to which the learner can recall the information in these passages is measured by ten questions. There are two of each of the following types of questions: sequence of events, factual, inferential, cloze, and main idea.

Participants are not allowed to view the reading passages while taking the quiz. As such, SmarterMeasure provides an assessment of reading recall, not reading comprehension. The intention of this component is to measure the degree to which a person can read academic information on-screen and then recall that information on a quiz, a task frequently replicated in online and technology-rich courses.

It should be noted that the reading rate and recall section of SmarterMeasure should not be used as an exhaustive reading skills inventory. Rather, it should be used as a screening device to identify learners who may have difficulty recalling what they have read on-screen. If a learner is identified as having opportunities for growth in this area, the school can then inform the student about the remediation and support resources it provides. Communicating these resources can be automated through the feedback mechanisms of SmarterMeasure.


Technical Competency

The technical competency and typing scales of SmarterMeasure are original, proprietary assessments initially developed by Dr. Mac Adkins. Dr. Adkins holds an Ed.D. from Auburn University in Educational Leadership with an emphasis on instructional technology. He was one of the authors of the Alabama Course of Study in Technology used by all public schools in Alabama, and a participating writer for the National Education Technology Standards (NETS) for Teachers document published by the International Society for Technology in Education. Dr. Adkins also teaches Administration and Leadership of Distance Learning Programs online for Capella University. Since their initial iteration, these scales have been revised numerous times based on input from schools that use the assessment.

The premise of the technical competency section is that if students do not possess basic technical competencies, they will quickly become frustrated and may drop out of the online course. The tasks measured in the technical competency section are basic technology skills which a learner should possess to begin studying online.

Typing Speed and Accuracy

Average typing speeds of people who type regularly in their occupations range from 50 to 70 words per minute, while average typing speeds for the general public are considered to be around 30 words per minute. Between July 1, 2009 and June 30, 2010, a total of 152,130 students completed the typing section of the SmarterMeasure assessment. Their average adjusted typing speed was 27.64 words per minute. This slower average rate of typing is a factor that should be considered both by schools as they design online courses and by students as they plan their time for participating in online courses. The adjusted typing speed was calculated by dividing the number of words typed by the elapsed time to obtain a gross words-per-minute rate, then subtracting a deduction for the number of errors.
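The adjusted-speed calculation described above can be sketched as follows. SmarterMeasure does not publish the exact size of its error deduction, so the one-word-per-minute penalty per error used here is an assumption for illustration only.

```python
def adjusted_wpm(words_typed: int, seconds_elapsed: float, errors: int) -> float:
    """Adjusted typing speed: gross words per minute minus an error deduction."""
    minutes = seconds_elapsed / 60
    # Gross speed in words per minute.
    gross = words_typed / minutes
    # Assumed penalty: one word per minute deducted for each error
    # (SmarterMeasure's exact deduction is not published).
    penalty = errors / minutes
    return max(gross - penalty, 0.0)

# A student who types 30 words in one minute with 2 errors
# scores an adjusted 28 WPM under this assumed penalty.
speed = adjusted_wpm(30, 60, 2)
```

The `max(..., 0.0)` guard simply prevents a very error-prone sample from producing a negative speed.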

The scales that measure Typing Speed and Accuracy are original, proprietary skills tests that were internally designed.

Adjusted Words per Minute

Adjusted Typing Speed
N                   152,130
Mean                27.64
Median              26
Mode                21
Standard Deviation  11.997

Decile (10%) Typing Scores
1st (top 10%)   44+ WPM
2nd 10%         37 - 43 WPM
3rd 10%         33 - 36 WPM
4th 10%         29 - 32 WPM
5th 10%         26 - 28 WPM
6th 10%         23 - 25 WPM
7th 10%         20 - 22 WPM
8th 10%         17 - 19 WPM
9th 10%         13 - 16 WPM
Bottom 10%      12 WPM or less
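Summary statistics and decile cut points like those in the table above can be computed from raw scores with the Python standard library. The sample below is a small hypothetical set of adjusted WPM scores, not the actual SmarterMeasure data set of 152,130 students.

```python
import statistics

# Hypothetical sample of adjusted typing speeds (WPM); the real
# SmarterMeasure data set had N = 152,130.
scores = [12, 15, 18, 21, 24, 26, 28, 31, 35, 40, 45]

mean = statistics.mean(scores)
median = statistics.median(scores)
# Nine cut points dividing the sorted scores into ten decile groups,
# analogous to the boundaries between rows of the decile table.
deciles = statistics.quantiles(scores, n=10)
```

A score at or above the ninth cut point falls in the top 10%, and a score below the first falls in the bottom 10%.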

Although the average adjusted typing speed of these students is lower than that of the general public, this may be partly explained by high levels of typing accuracy. The nature of academic assignments prompts students to be more concerned with accuracy than speed, since inaccurate words could negatively affect their grades on submitted assignments. On a scale of 0 to 100%, the average typing accuracy of these 152,130 students was 92.41%.

Typing Accuracy
N                   152,130
Mean                92.41%
Median              98%
Mode                100%
Standard Deviation  16.956