Hatchways uses a real-world, customizable take-home assessment process to help you get more signal on your applicants and hire better. With Hatchways assessments, candidates work in a GitHub repository and make pull requests to submit code for review, often building off an existing code base. The Hatchways platform has all the tools you need to assemble, customize, send, and track assessments, as well as premium features that save your team time, such as ATS integration and automated or personalized review of your candidates' submissions by the Hatchways team of experienced engineers.
To get started, you can sign up here! You can pick an assessment to send to your applicants from the assessment Catalogue page, or upload your own existing assessment. Invite candidates by email or share a general sign up link, then follow their progress on the Candidates tracking page. From the Candidates page, you can also see more details about each applicant’s progress as well as access the GitHub repository we created for their assessment.
If you click View details for a candidate, there is an Archive Candidate button. This can be used to move the candidate to the Archived column on the page once you no longer need to follow their progress. This can be undone at any time. If you would like to keep a candidate from completing or viewing their assessment, such as if they missed a submission deadline, you can use Disable Assessment to disable their assessment page.
Pricing information can be found here on our website.
Yes, we provide both monthly and yearly billing options! You can be charged on a monthly basis with no annual commitment.
Kindly reach out to your Accounts Manager to initiate the cancellation.
Feel free to reach out via hello@hatchways.io if you have any questions.
Our assessment Catalogue has filters that can be used to narrow assessment options to those that match the role type, experience level, and languages for the position you are trying to fill. From there, we recommend reviewing the details of the assessment options to pick one that assesses skills most in line with what you would like to evaluate in applicants.
The experience level is determined both by the complexity of the topics used in the assessment and by how much the candidate is expected to complete within the recommended time frame. For example, a junior assessment would be simpler and/or require less total output than a similar senior assessment of the same length.
Each assessment in the Catalogue has a recommended Time to Complete. This is the average time span that most candidates of the recommended level will take to complete the entire assessment. These are general guidelines, and we recommend editing the wording of the candidate instructions if you would like to add a loose or strict time limit for completion.
When candidates complete an assessment, you can see the Total Completion Time for their submission on the View Details tab of the Candidates tracking page. Please note, however, that this is simply the time from when they started to when they finished; it does not account for breaks taken, or for candidates who started the assessment but did not sit down to work on it until later.
You can modify any existing assessment to better fit the skills you need to vet for, or you can work with Hatchways to build a custom assessment tailored to your use case.
Each assessment in the Catalogue has a Request Language button on the Details page that can be used to let us know what language you would like to see added. If you request the needed tech stack for your role, Hatchways can custom create a version of the assessment in that language.
Yes! If you modify an assessment from the Catalogue, it will be stored in Your Assessments. To make a new copy of the original assessment, return to the Catalogue and hit “Select” again on the original assessment. This will add a second copy of that assessment to Your Assessments, which can be independently modified. This can be used, for example, to make different versions of an assessment for different levels of applicants.
Yes! The New/Import button in the dashboard can be used to upload your existing assessment. We can copy assessment starter code from either a public GitHub repository or from an uploaded zip file.
Hatchways has protocols in place to prevent plagiarism, such as reviewing submissions for identical code that has been submitted before. We also restrict what email domains can be used to create a Hatchways account and have monitoring in place to determine if a candidate invited to complete an assessment attempts to sign up for a company account. To make your assessment less prone to cheating, we recommend editing the assessment to fit your unique needs, or working with Hatchways to build a new, custom assessment.
A good assessment for technical interviews should effectively test relevant skills without overwhelming the candidate. Focus on optimizing the signal-to-time ratio to gather useful information about candidates while maintaining their engagement throughout the process.
Key steps to create such an assessment include: identifying essential skills and topics, refining the list to core skills, creating a starting codebase (optional), testing the assessment, and developing a rubric while monitoring metrics.
You can read this blog post to see how to create a good React assessment.
This is not necessary and it depends on the skills you are assessing. For more practical skills, it is best practice to provide the candidate with a starting codebase to work from, as this will allow them to focus on the essential skills you're testing. This approach allows you to learn about the candidate without overwhelming them, which could deter them from continuing in your interview process. Having candidates start from scratch wastes time on setting up and incorporating boilerplate code.
For an assessment you find in the Catalogue, there is a “Modify” button on the Details panel for that assessment. For any assessment in Your Assessments, the “Edit” button will take you to the edit page.
A starter code is a template we’ll use to generate a GitHub repository for candidates to work off of during their assessment. It can contain whatever you’d like the candidate to have in their repository, such as a starting code base, or just a README. You can add multiple starter codes to offer candidates a choice between starting code bases of different languages.
Hatchways assessments all use a starter code repository as a starting place for candidates to build from, so each language to be supported will typically have a matching starter code base. You can disable a starter code to remove that language option by disabling the toggle on the Starter Code list of the edit page, and can add a new starter code to add a new language option. To update the language label for an existing starter code, you can remove and add labels in the languages box below each starter code URL.
Another option is to use a single empty or generic repository and allow any language (or a specified range of language options). In this case, we suggest creating one Starter Code, setting its language to “Language Agnostic”, and specifying any general restrictions in the ticket description.
If you want to create an assessment without a starter code repository, you can select “skip” on the Starter code section when uploading an assessment. This will create an empty GitHub repository that will serve as the starting point for candidates, with a language label of “Language Agnostic”.
First, you will need to invite a GitHub username and grant it access to the starter code repositories so that you can make edits. To do this, use the Repository Access sidebar on the assessment edit page. This grants write access so that you can make changes directly in the repository. To update the version of the starter code used in the assessment, you will need to add a tag for your most recent changes and select the new tag from the drop-down list next to that starter code on the assessment edit page.
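As a sketch of the git side of this workflow (the repository here is a throwaway stand-in; in practice you would clone the starter code repository linked on your assessment edit page):

```shell
# Throwaway local repo standing in for your starter code repository;
# in practice: git clone <starter-code-repo-url> && cd <repo>
git init -q starter-demo && cd starter-demo

# ...edit the starter code, then commit your changes...
git -c user.email=you@example.com -c user.name=You \
    commit --allow-empty -m "Update starter code" -q

# Tag the new version; in a real repo, push it with: git push origin v1.1
git tag v1.1
git tag --list
```

Once the tag is pushed, it should appear in the tag drop-down on the assessment edit page.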
Markdown formatting can be used in the problem description and the ticket description. If your assessment offers starter codes in more than one language, you can also reference the language the candidate selected in the ticket description, or vary the wording based on that choice. This can be done using jinja2 templating in the following format:
{% if "node.js" in technologies %} Node specific wording {% elif "python" in technologies %} Python specific wording {% endif %}
In this templating, “technologies” is the list of language labels added to the assessment’s starter codes.
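As a minimal sketch of how such a template behaves, the snippet below renders a hypothetical ticket-description fragment with the jinja2 Python library (assuming it is installed), supplying “technologies” as the list of language labels:

```python
from jinja2 import Template  # third-party: pip install jinja2

# Hypothetical ticket-description fragment using the same pattern as above
ticket = Template(
    '{% if "node.js" in technologies %}Run `npm install` first.'
    '{% elif "python" in technologies %}Run `pip install -r requirements.txt` first.'
    '{% endif %}'
)

# "technologies" holds the language labels of the candidate's chosen starter code
print(ticket.render(technologies=["node.js"]))  # → Run `npm install` first.
print(ticket.render(technologies=["python"]))   # → Run `pip install -r requirements.txt` first.
```

If none of the listed languages match, the block renders as empty text, so a final `{% else %}` branch can be added for generic wording.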