Tuesday, March 11, 2014

VocabKitchen: Awesome Vocab. Profilers!

http://vocabkitchen.com/Default.aspx
VocabKitchen's Academic Word List (AWL) Vocabulary Profiler (beta version) is already awesome! It quickly highlighted, listed, and sorted on- and off-list vocabulary in sample texts. It was also "dictionary enabled," which meant you could get Longman Dictionary of Contemporary English Online definitions of almost any word by double-clicking on it (though not a name like Tanaka). Though the percentages of on- and off-list words for the first sample I tried didn't add up (7% and 101%, respectively), text and highlight colours were easy to read on the white background.

VocabKitchen's General Service List (GSL) Vocabulary Profiler (beta version) performed similarly well on the same sample. Highlighting, listing, and sorting were speedy, but again the percentages didn't add up (see below). It's still a quick, cool, easy-to-use tool. However, VocabKitchen's choice of tint for highlighting words in the GSL 2nd 1K (RGB: 48, 152, 204) offered slightly less background-to-text contrast than the AWL Vocabulary Profiler's on-list colour had (especially when used on a grey heading bar), and the location of the sort button in the GSL-beta profiler differed for off-list words.


VocabKitchen's Common European Framework of Reference for Languages (CEFR) Vocabulary Profiler was available in both a beta and a regular version. The main differences seemed to be that 1) the CEFR-beta version was "dictionary enabled" like the AWL-beta profiler, and 2) the CEFR-beta version also provided an Export to Word function. Nevertheless, background-to-text contrast, at least for B1 level words and corresponding headings (RGB: 249, 154, 0), seemed a bit problematic.


On the CEFR-beta display, VocabKitchen avoided the low-contrast problem on grey header bars by using a shade close to black (RGB: 34, 34, 34). Something closer to black (0, 0, 0) might be even better!
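The contrast impressions above can be put on a numeric footing with the WCAG 2.0 relative-luminance and contrast-ratio formulas. A minimal sketch, using the RGB values mentioned in this post and assuming a white background for comparison:

```python
def channel(c):
    # Linearize one 8-bit sRGB channel per the WCAG 2.0 definition
    c = c / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(rgb):
    # Relative luminance from linearized R, G, B
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast(rgb1, rgb2):
    # WCAG contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05)
    lighter, darker = sorted((luminance(rgb1), luminance(rgb2)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

white = (255, 255, 255)
gsl_2k = (48, 152, 204)   # GSL 2nd-1K highlight tint noted above
b1 = (249, 154, 0)        # CEFR B1 colour noted above
dark = (34, 34, 34)       # near-black header shade noted above

for name, colour in [("GSL 2nd 1K", gsl_2k), ("CEFR B1", b1), ("header", dark)]:
    print(f"{name}: {contrast(colour, white):.2f}:1 on white")
```

On these numbers, the near-black header shade comfortably clears WCAG's 4.5:1 threshold for normal-size text on white, while both highlight tints fall below it, which matches the readability complaints above.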


VocabKitchen (also in beta), please keep up the good work. I'm looking forward to trying out the Social Reader Tool. In the meantime, I'll strive to cut down on off-list words–at least on this blog!

[328 words]

Tuesday, March 4, 2014

PaperRater Grammar and Spelling Check

"PaperRater.com is a free resource, developed and maintained by linguistics professionals and graduate students. PaperRater.com is used by schools and universities in over 46 countries to help students improve their writing" (About PaperRater, ¶1, 2014.03.04).
Screen snapshot of graphic on the PaperRater site
This post frames a comment that I posted today on the Digital Mobile Language Learning blog (Writing Tools for the Self-directed Learner Part 2, 2014.01.19), after trying out PaperRater, which provides, among other free services:
  • Spelling and grammar checks,
  • Style and word choice analyses, and
  • Readability statistics (PaperRater: Features).
After trying a Google+ post that was too short to ... [get] feedback on many of the categories, I gave PaperRater (PR) another spin on a working paper I ... [had written] a while back. The whole paper turned out to be too long for free assessment, so I cut the sample back to the end of the first section: 719 words per PR's count, 777 per Microsoft® Word 2008 for Mac.
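A 719-vs-777 gap is typical when two tools apply different counting rules. A hypothetical sketch (the sentence is invented, not from the working paper) showing how two plausible rules diverge on the same text:

```python
import re

text = "State-of-the-art spell-checkers don't agree on what a 'word' is."

# Rule 1: split on whitespace only, so hyphenated compounds
# and contractions each count as one word
whitespace_count = len(text.split())

# Rule 2: count alphanumeric runs, so hyphens and apostrophes
# split a compound into several words
token_count = len(re.findall(r"[A-Za-z0-9]+", text))

print(whitespace_count, token_count)  # → 9 14
```

Counters that treat hyphenated forms, contractions, numbers, or citations differently can easily disagree by dozens of words over a 700-word sample.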
PR results, as Dan [Ferreira] suggested they would be, were quite interesting: 
+/- The spell check flagged one word apparently broken in the PDF from which I'd copied the sample, but also returned a false positive on the word conflate.  
+ Grammatical analysis revealed no errors. 
+ The numerical score for inappropriate word choices was, I guess, low (0.998). 
+/- The feedback on style in the web display focused mostly on sentence length (the longest: 53 words), but mistakenly indicated that more than half of the sentences were passive. I checked by hand, but found only one passive clause–in a quotation. 
-/+ Though the feedback on style in the PDF output was different, focusing on transition words, the PDF included general tips for using such words. 
+/- The vocabulary score seemed high (96), yet the feedback included only a subset (9 of 20) of the sophisticated words counted. PaperRater did not mark such words in the sample.
The numeric grade from the auto-grader bore a note to the effect that it was "based on [a] college grading scale," which was followed by a stronger one indicating that PR "does NOT examine the meaning of your words, how your ideas are structured, or how well your arguments are supported" [(Auto Grader, NOTE, ¶1)].
All in all, PR looks like it's worth asking students to try, as long as they can grasp its limitations.
I'm looking forward to trying it out with students soon.

[413 words]