Designing a React take-home assessment for technical interviews can be a daunting task. Striking the perfect balance between testing relevant skills and not overwhelming the candidate is crucial. In this blog post, we'll walk you through the steps to create an effective React take-home assessment.
Step 1: Identify key skills and topics to test
Start by determining the key topic areas and skills you'd like to assess. Consider the work you do daily, the seniority level of the candidate, and the specific React concepts and core engineering skills you'd like to evaluate.
Here is a possible list of React-specific skills:
Writing organized and maintainable code
- Your team might place significant importance on well-structured code, especially when dealing with a large codebase. This can involve creating reusable components in React, prompting candidates to refactor specific code segments for enhanced reusability. Additionally, this subject can be connected to state management, exploring ways to better organize code and prevent prop drilling.
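For instance, one common refactoring prompt is to replace deeply drilled props with React context. Here is a minimal sketch of that pattern; the ThemeContext, Toolbar, and ThemedButton names are purely illustrative and not tied to any particular assessment:

```tsx
// Minimal sketch: avoiding prop drilling with React context.
import React, { createContext, useContext } from "react";

const ThemeContext = createContext<"light" | "dark">("light");

function ThemedButton() {
  // Reads the theme directly from context instead of receiving it
  // through every intermediate component's props.
  const theme = useContext(ThemeContext);
  return <button className={theme}>Checkout</button>;
}

function Toolbar() {
  // No theme prop needs to pass through this layer.
  return <ThemedButton />;
}

export default function App() {
  return (
    <ThemeContext.Provider value="dark">
      <Toolbar />
    </ThemeContext.Provider>
  );
}
```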
State management
- One of the fundamental abilities in React involves managing state across various components and extensive web applications. State management often comes into play in scenarios such as intricate forms (consider situations like conditional form validation, multi-page onboarding forms, or auto-complete features), as well as data searching and filtering (for instance, pagination, crafting multiple filters, and search functionalities as seen on e-commerce websites). By reviewing your previous pull request conversations or examining your codebase, you'll likely discover numerous other examples.
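As a concrete illustration, here is a minimal search-and-filter sketch of the kind of state management a candidate might implement; the product data is made up, and in a real assessment the list would typically come from an API:

```tsx
// Minimal sketch: deriving filtered results from search state.
import React, { useMemo, useState } from "react";

const PRODUCTS = ["Keyboard", "Monitor", "Mouse", "Webcam"];

export default function ProductSearch() {
  const [query, setQuery] = useState("");

  // Derive the filtered list from state rather than storing it separately.
  const results = useMemo(
    () =>
      PRODUCTS.filter((p) => p.toLowerCase().includes(query.toLowerCase())),
    [query]
  );

  return (
    <div>
      <input
        value={query}
        onChange={(e) => setQuery(e.target.value)}
        placeholder="Search products"
      />
      <ul>
        {results.map((p) => (
          <li key={p}>{p}</li>
        ))}
      </ul>
    </div>
  );
}
```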
Styling
- Your team may value styling and creating pixel-perfect designs. On the other hand, styling may not be an important consideration if your application is built on a component library. Depending on your team's values, it could be worthwhile to evaluate a candidate's proficiency in using a component library, replicating a Figma design, or demonstrating similar skills.
Debugging + Testing
- If your codebase is particularly complex or your team places great emphasis on test cases, you might want to design an assessment that gauges one's aptitude for resolving bugs, repairing test cases, writing new test cases, or contemplating further edge cases.
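To make this concrete, here is a minimal sketch of the kind of test a candidate might be asked to write or repair, assuming Jest, React Testing Library, and @testing-library/jest-dom are already configured in the starting codebase. It exercises the hypothetical ProductSearch component from the earlier state management sketch:

```tsx
// Minimal sketch: a test a candidate might write or fix.
import React from "react";
import { render, screen, fireEvent } from "@testing-library/react";
import ProductSearch from "./ProductSearch"; // hypothetical component under test

test("filters the product list as the user types", () => {
  render(<ProductSearch />);

  fireEvent.change(screen.getByPlaceholderText("Search products"), {
    target: { value: "mon" },
  });

  // Only the matching product should remain visible.
  expect(screen.getByText("Monitor")).toBeInTheDocument();
  expect(screen.queryByText("Mouse")).not.toBeInTheDocument();
});
```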
Asynchronous programming
- Occasionally, your application may involve numerous asynchronous operations, necessitating that candidates possess a solid understanding of the event loop or JavaScript promises. A prime example of this could be requiring candidates to retrieve data from an API (or multiple APIs) and present the information on-screen.
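Here is a minimal sketch of that kind of task, fetching from a placeholder endpoint (https://api.example.com/users is not a real API) and rendering the results with basic error handling:

```tsx
// Minimal sketch: fetching data asynchronously and rendering it.
import React, { useEffect, useState } from "react";

type User = { id: number; name: string };

export default function UserList() {
  const [users, setUsers] = useState<User[]>([]);
  const [error, setError] = useState<string | null>(null);

  useEffect(() => {
    let cancelled = false;

    fetch("https://api.example.com/users")
      .then((res) => {
        if (!res.ok) throw new Error(`Request failed: ${res.status}`);
        return res.json();
      })
      .then((data: User[]) => {
        if (!cancelled) setUsers(data);
      })
      .catch((err: Error) => {
        if (!cancelled) setError(err.message);
      });

    // Avoid setting state after the component has unmounted.
    return () => {
      cancelled = true;
    };
  }, []);

  if (error) return <p>Something went wrong: {error}</p>;
  return (
    <ul>
      {users.map((u) => (
        <li key={u.id}>{u.name}</li>
      ))}
    </ul>
  );
}
```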
SEO and accessibility issues
- At times, you might aim to assess more specialized HTML expertise, asking candidates to showcase their understanding of accessibility or the development of an application with a focus on SEO.
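As a rough illustration, an accessibility-focused prompt might look for details like the ones below: explicit label association, semantic form elements, and an aria-live region for status updates. The NewsletterSignup component is a made-up example:

```tsx
// Minimal sketch: accessibility details an assessment might probe.
import React, { useState } from "react";

export default function NewsletterSignup() {
  const [status, setStatus] = useState("");

  return (
    <form
      onSubmit={(e) => {
        e.preventDefault();
        setStatus("Thanks for subscribing!");
      }}
    >
      {/* Explicit label/input association instead of a placeholder-only input */}
      <label htmlFor="email">Email address</label>
      <input id="email" type="email" required />

      <button type="submit">Subscribe</button>

      {/* Announce the result to screen readers without moving focus */}
      <p aria-live="polite">{status}</p>
    </form>
  );
}
```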
Security
- If your organization prioritizes security, consider designing a challenge centered on front-end token storage, session management, user account handling, or addressing prevalent security threats (such as those listed in the OWASP Top Ten).
Improving performance and architecture
- For seasoned engineers, you might aim to evaluate their capabilities in enhancing the performance and overarching architecture of your application. Consider assessing their knowledge of preventing re-renders in React (which complements state management), minimizing repetitive API calls through debouncing or throttling, scaling the application via caching, or addressing general concepts related to preparing an application for production (e.g., Babel, minification, app building, and so on).
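For example, a re-render-focused task might expect candidates to reach for React.memo and useCallback. The sketch below is illustrative only; the ExpensiveList and Dashboard names are placeholders:

```tsx
// Minimal sketch: avoiding unnecessary re-renders with memo and useCallback.
import React, { memo, useCallback, useState } from "react";

const ExpensiveList = memo(function ExpensiveList({
  items,
  onSelect,
}: {
  items: string[];
  onSelect: (item: string) => void;
}) {
  // Re-renders only when `items` or `onSelect` actually change.
  return (
    <ul>
      {items.map((item) => (
        <li key={item} onClick={() => onSelect(item)}>
          {item}
        </li>
      ))}
    </ul>
  );
});

export default function Dashboard() {
  const [count, setCount] = useState(0);
  const [items] = useState(["Alpha", "Beta", "Gamma"]);

  // Stable callback identity, so ExpensiveList does not re-render
  // every time the unrelated counter changes.
  const handleSelect = useCallback((item: string) => {
    console.log("selected", item);
  }, []);

  return (
    <div>
      <button onClick={() => setCount((c) => c + 1)}>Clicked {count}</button>
      <ExpensiveList items={items} onSelect={handleSelect} />
    </div>
  );
}
```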
And here is a list of more general engineering skills you may want to consider:
Reading vs. Writing Code
- Developing an assessment focused on writing code is a common approach, but occasionally, you may want to gauge a candidate's proficiency in reading code, particularly for more experienced candidates. Contemplate designing an assessment that entails performing a code review.
Written communication
- Think about posing follow-up questions subsequent to a code-writing task. This method enables you to evaluate more sophisticated concepts without obligating candidates to invest an inordinate amount of time in the assessment. For instance, you might inquire about scalability or getting the project ready for production.
Taking feedback & asking questions
- Contemplate incorporating a round of asynchronous feedback for the take-home assessment. Conducting this via GitHub and pull requests has proven to be an effective workflow, or alternatively, inviting candidates to join a Slack channel can be another suitable approach.
Verbal communication
- An effective method to assess verbal communication skills involves posing several follow-up questions and requesting candidates to record a Loom video of their responses, rather than providing written answers. Companies that frequently use Loom videos have found this approach to be a valuable addition to their React take-home assessments.
Speed
If coding speed is a crucial factor, explore methods to time the assessment. We recommend a few options:
- Establish a calendar block (using a tool like Calendly) for candidates to complete the task within a specific time frame
- Monitor the duration or proctor the assessment using an online coding assessment platform
- Provide a suggested time frame for completion and utilize follow-up meetings to determine the time taken
Step 2: Refine that list to a few core skills
Naturally, you are going to want to test more skills than you can feasibly cover in a single assessment. You'll need to refine that list and start developing the rubric you'd like to use to evaluate candidates. Initially, this can be a straightforward list of skill categories you'd like to measure. It's advisable not to create a fully-fledged rubric at this stage, as it will likely be adjusted as you develop and test the challenge.
As you refine the list, consider the assessment's position within your interview process. If it occurs early on, it may be more appropriate to concentrate on essential skills and maintain a shorter assessment. However, if the assessment is later in the process and serves as the primary evaluation method, feel free to explore more complex concepts and allot additional time for completion.
Step 3: Create a starting codebase
It is best practice to provide the candidate with a starting codebase to work from, as this will allow them to focus on the essential skills you're testing. When designing an assessment, optimize the signal-to-time ratio, aiming to gather the most information in the least amount of time. This approach allows you to learn about the candidate without overwhelming them, which could deter them from continuing in your interview process. Having candidates start from scratch wastes time on setting up a React app and incorporating boilerplate code. For instance, if assessing state management knowledge, complex issues typically arise in larger codebases with numerous components. Without providing a starting codebase, simulating a suitable problem would require excessive work from the candidate.
Here are the general steps for creating a starting codebase. Based on your refined list of core skills, brainstorm a project theme and a related task. Here is an example:
- Skills: State Management, Writing organized and maintainable code, Improving performance
- Possible Tasks or Themes: implement an auto-complete feature (using debouncing) on a checkout form for an e-commerce platform, add pagination logic for search results on an events booking app, or refactor a movie filtering application to reduce re-renders (a minimal sketch of the first task follows below)
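To make the first task concrete, here is a minimal sketch of a debounced auto-complete input. The /suggestions endpoint and the 300 ms delay are placeholder choices, not part of any specific assessment:

```tsx
// Minimal sketch: a debounced auto-complete input.
import React, { useEffect, useState } from "react";

export default function AutoComplete() {
  const [query, setQuery] = useState("");
  const [suggestions, setSuggestions] = useState<string[]>([]);

  useEffect(() => {
    if (!query) {
      setSuggestions([]);
      return;
    }

    // Debounce: wait 300 ms after the last keystroke before calling the API.
    const timer = setTimeout(() => {
      fetch(`/suggestions?q=${encodeURIComponent(query)}`)
        .then((res) => res.json())
        .then((data: string[]) => setSuggestions(data))
        .catch(() => setSuggestions([]));
    }, 300);

    // Each new keystroke cancels the previous pending request.
    return () => clearTimeout(timer);
  }, [query]);

  return (
    <div>
      <input value={query} onChange={(e) => setQuery(e.target.value)} />
      <ul>
        {suggestions.map((s) => (
          <li key={s}>{s}</li>
        ))}
      </ul>
    </div>
  );
}
```

Clearing the timeout in the effect's cleanup is what actually implements the debounce: only the last value typed within the delay window triggers a request.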
Once you have a theme and task, it's time to start writing the initial codebase and the task instructions. We have found ChatGPT to be a great tool both for coming up with starting codebase ideas and for writing the initial codebase and task instructions.
Step 4: Test the assessment
Before sharing the assessment with candidates, have someone else test it to ensure it is the appropriate length and difficulty. Employees are great for this, as they can provide valuable feedback from a developer's perspective. Keep in mind that you will likely need to iterate on the assessment to optimize the signal-to-time ratio.
Step 5: Develop a Rubric and Monitor Metrics
As you share the assessment with candidates, you will flesh out more details of the rubric you created in Step 2. Continuously iterate on the assessment and rubric based on candidate feedback and performance. Two key metrics to help you optimize the assessment are:
- Withdrawal rate: Aim for a low withdrawal rate by ensuring the assessment is engaging and time-efficient. Collecting candidate feedback through Net Promoter Scores (NPS) can provide valuable insights.
- Pass-through rate: Track how well candidates who perform well on the take-home assessment do in the next stage of the hiring process. This will help you refine your assessment to better correlate with overall candidate success.
Optimizing these two metrics can go a long way and can also create cost savings in the long run. To see how these metrics can reduce hiring costs, check out this article.
Conclusion:
And there you go! If you'd like me to evaluate your take-home assessment or offer additional insights on accelerating your assessment creation process using ChatGPT, feel free to send me a direct message on LinkedIn. If you want to learn more about our more general practical assessment philosophy, check out this article.
Do you want to try a great React take-home experience? Try an assessment here.