CYF Docs
We now have a partnership with Codility to run automated assessments.
What is evaluation for?
Module Template


  • One test per module to start
  • Autograded
  • Mandatory
  • A diagnostic tool, not an exclusion tool

Why do we use automated tests?

Some reasons we think this is a good thing to do:
Helping trainees get comfortable with technical tests: Codility is a real platform used by companies hiring software engineers. It's better to be comfortable in this kind of environment beforehand. We want trainees to have lots of room to get things wrong and figure stuff out in a friendly and supportive environment.
A standardised check-in: to see where trainees are in their development and to figure out what we can work on most effectively, so nobody gets lost in the crowd.
Clearing space for mentorship: Buddies shouldn’t be rote-marking and manually updating spreadsheets. We want to lift these burdens as much as possible from volunteers at CYF to clear space for meaningful, personal technical mentorship. This is part of that quest.
Revealing problems systematically: if one trainee has a mental model error, that’s something to work on with that trainee. If half the trainees make the same mistake, that’s something to work on in our syllabus. Codility has a dashboard that makes it a bit easier to interrogate this data and reveal these issues.

How does it work?

We are working on going through all our previous tests, quizzes, and other evaluations to define an initial Module Exit test per module. Trainees will take one test per module, at the end. It’s a timed test taken once and without help. This is a work in progress and we should expect to revise and adapt this with data.
A test will be opened; a class will be invited by email. Sometimes we will run the test in class. The test will stay open for 7 days. At the end of 7 days the test will be closed and anyone who hasn’t taken it will get a zero. A test will be an hour or less.
Candidates must provide their GitHub URL when asked, or we will have to burn valuable volunteer hours/days figuring out who is who.

What else should we know?

These tests are a blunt tool and will deliver false negatives, so we will never use a percentage mark on Codility as the only metric when figuring out how a trainee is doing. The Codility system is largely geared towards recruitment at scale: screening out people who cannot code. That means false positives are unlikely, but false negatives are not.
We can replay any test, which should make it easier to identify key blockers in an individual’s understanding. There is no expectation on volunteers to go and do this routinely. This is part of reducing the marking burden on Buddies, not increasing it.
We are still thinking about ‘retakes’ and how useful they are -- let us know what you think.
We ran a pilot with 22 trainees, who tested the platform and also gave us feedback on the experience. The feedback was overwhelmingly positive.

What do trainees need to know?

Trainees must take the test on their own. No pairing on Codility.
The interface is a bit similar to Codewars, so practising on Codewars will develop many of the skills needed for Codility.
1 - Use the tests
The key thing many people missed in our pilot was how to use the tests, or even that they should use the tests. The sample tests are there to help. Trainees should test frequently before they submit and use the results to improve their score. The assessment records every time the tests are run. A successful coder is in the habit of testing often, and of showing that they test often.
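As a sketch of this habit (the task and function name here are made up, not a real Codility question): write the smallest thing that might work, run the sample checks straight away, and keep re-running them as you refine your answer.

```javascript
// Hypothetical task: return the sum of the even numbers in an array.
function sumOfEvens(numbers) {
  let total = 0;
  for (const n of numbers) {
    if (n % 2 === 0) {
      total += n;
    }
  }
  return total;
}

// Run sample checks early and often, not just once at the end.
console.assert(sumOfEvens([1, 2, 3, 4]) === 6, "expected 6");
console.assert(sumOfEvens([]) === 0, "expected 0 for an empty array");
console.assert(sumOfEvens([-2, 5]) === -2, "expected -2");
```

Each run of the sample tests on Codility is recorded, so a trail of frequent test runs is itself evidence of good working habits.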
2 - Read the requirements
The next biggest issue people had is related to not using or reading the tests: not interpreting the requirements accurately. Codility is an opportunity to practise interpreting requirements and using time effectively.
For example, in our pilot, people wasted time adding Bootstrap classes to their HTML, even though there was no Bootstrap CSS available to them and styling was not evaluated at all in the sample tests.
3 - Use the tools
There’s an onboarding introduction and a demo. Do the demo! Use every tool available. This includes Google and the Syllabus.


Can I use VSCode?
Trainees may use an IDE and paste their code in, but this means mentors won’t get the benefit of the replay, so we won’t record the key insights we might need to help them.
If you do want to use VSCode, paste and test iteratively, instead of all in one go.
If you have a disability and need extra time, we can set that up in advance. There’s also a button in the interface to signal this, but it doesn’t tell us anything specific, so we also need some info about what is needed.
Trainees, please contact your Buddy about this.
If something goes wrong outside of your control (the internet goes down, a baby won’t stop crying) it’s OK: we can rerun the test for you. This is explained in the welcome message.
Trainees, please contact your Buddy about this.
The results go to the CYF staff account on Codility; individual test takers don’t get a mark at the end. To understand how correct your code is, use the tests.