# CLRS practice exam generator

Generates practice exams by randomly selecting questions and their answers from the CLRS textbook.
## Why?
Interleaved practice means mixing questions from several subjects together in one study session. Textbooks are designed so that all the questions relevant to a particular topic are grouped together. That's good for reference, but not all that good for practice.
## How?
Peng-Yu Chen and the contributors to walkccc/CLRS have collected solutions to many (but not all!) of the problems in the third edition of CLRS. They have also conveniently, and seemingly accidentally, structured them in a way that lends itself well to being parsed into a database. I've written a script that automatically sorts all of these problems into an SQLite database, and another you can use to query it and generate nice-looking HTML practice exams.
This is probably best described as something that barely works: I duct-taped it together while bored in class and bothered to do very little error handling. Not to mention, walkccc/CLRS is incomplete as far as data sets go, though it's the largest and most well-formatted set that I'm aware of.
## How can I generate new practice exams?
This setup assumes you have Python 3, Git, and the `cmarkgfm` Python package. Before you get started, make sure `cmarkgfm` is installed. On Debian:
```sh
$ sudo apt install python3-cmarkgfm
```
Or with pip, call:
```sh
$ pip install cmarkgfm
```
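If you want to quickly confirm the package is importable, a one-liner like this should print a small chunk of HTML. It's just a smoke test; nothing in the scripts depends on it, and it assumes `cmarkgfm` exposes `github_flavored_markdown_to_html`, which is the API I'm aware of:

```sh
$ python3 -c "import cmarkgfm; print(cmarkgfm.github_flavored_markdown_to_html('**hello**'))"
```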
Next, run the script below if you're on some kind of Unix machine. Something similar should work on Windows, but I don't know how that stuff works.
```sh
# 1. Make yourself a workspace
mkdir work
cd work

# 2. Get the problems
git clone https://github.com/walkccc/CLRS

# 3. Get the scripts
git clone https://git.nats.solutions/nat/clrs-practice-exam
cd clrs-practice-exam

# 4. Clean the data
python clean-data.py ../CLRS/docs

# 5. Generate the practice exam
python practice-exam-generator.py "select * from problem where chapter in (3,4,7,12,15,16,22,24,26,34) order by random() limit 20" --template template.html
```
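If you'd rather poke at the database directly instead of going through the generator, and you have the `sqlite3` command-line tool installed, something like this works. Note that `problems.db` is just a placeholder name; check `clean-data.py -h` for where the database actually ends up.

```sh
# Count parsed problems per chapter; "problems.db" is a guessed file name, not necessarily the real one
$ sqlite3 problems.db "select chapter, count(*) from problem group by chapter"
```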
## Specifics
`practice-exam-generator.py` takes a positional argument that's an SQL query. The query in the example above is the one I used to generate the example exams; it's kind of, but not entirely, reflective of the material I've covered in my algorithms course.
The relevant table in the database is made like this:
```sql
create table if not exists problem (
    problem_number number not null,
    question text not null,
    answer text not null,
    chapter number not null,
    section number,
    starred number not null
);
```
So, you can use all of this information to construct your SQL query.
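For example, here are a couple of hypothetical variations on the invocation above. Note that `starred = 1` is my assumption about how starred problems are encoded, so check `clean-data.py` if that matters to you:

```sh
# 20 random starred problems from anywhere in the book
# (assumes starred problems are stored with starred = 1)
python practice-exam-generator.py "select * from problem where starred = 1 order by random() limit 20" --template template.html

# 10 random problems drawn only from chapters 22 and 24
python practice-exam-generator.py "select * from problem where chapter in (22, 24) order by random() limit 10" --template template.html
```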
Both scripts take an `-h` argument that breaks down what you can pass to them to tweak things based on your setup.
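So, when in doubt:

```sh
python clean-data.py -h
python practice-exam-generator.py -h
```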
The `template.html` document should probably include KaTeX, lest the many, many LaTeX equations be broken. The template must include the string `%$%content` somewhere in it; it'll be replaced with all the selected problems when the exam is generated.
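If you're writing your own template, here's a quick way to confirm the placeholder is still in there (just a sanity check on your end, not something the scripts ask for):

```sh
$ grep -cF '%$%content' template.html
```

If that prints 0, the generator presumably has nowhere to put the selected problems.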
"License"
This software (that is, the scripts; not including anything you'll find in `./examples`) is a gift from me to you. You form a relationship with all those from whom you accept gifts, and with this relationship come certain expectations. Namely:
- When you share this gift with others, you will share it in the same spirit as I share it with you.
- You will not use this gift to hurt people, any living creatures, or the planet.
The questions were written by Michelle Bodnar and Andrew Lohr and are used here on a fair-use basis. The answers are organized in the walkccc/CLRS repository by Peng-Yu Chen and contributors; their repository is shared under the MIT license.