
How to pass our technical screening

People who want to join LeadingAgile in a technical consulting role help us gain a sense of how they approach common problems by working through one or more hands-on exercises. Just as a picture is worth a thousand words, a demonstration is worth a thousand interview questions.

Somehow, a lot of candidates seem to miss the point of the hands-on exercises. We’ve examined the instructions carefully to see if we forgot to mention something important, and we don’t see what’s missing. Of course, anything can be improved, and feedback is always welcome. Yet, around nine out of every ten candidates seem not to understand what we’re asking them to do. So, we thought it might be a good idea to clarify.

The point is not to trip you up with a crazy, complicated exercise. We want to give you the opportunity to show us how you generally approach the types of work you claim to be able to perform. We only ask you to show us the things you say you can do, and only at the level of proficiency you claim to have. (Hint: Re-read the previous sentence and review your résumé. Exaggeration and gold-plating will not serve you well.) We understand that nobody knows everything, and we don’t expect that. After all, it would hardly be fair for us to ask you to do more than we can do ourselves, and none of us knows everything.

Types of work, levels of proficiency

We’ve tried to put together exercises that cover the main types of work people do in software development and delivery as they pertain to our consulting services, and to cover three general levels of proficiency in each area.

What are the main types of work? Well, there are many, as you can well imagine. To be practical, we’ve reduced the list to just a few. They are centered on skills relevant to application development and delivery. You might disagree with some of the words, but here’s the gist of it:

  • Programming – includes architecture, design, coding, and test automation
  • Database – focuses on application-related database skills rather than DBMS tuning and troubleshooting
  • Testing – includes manual software testing techniques
  • Analysis – includes general logical thinking as well as customer interaction skills for eliciting information
  • DevOps & sysadmin – includes basic sysadmin skills for mainstream platforms and basic DevOps skills such as setting up continuous integration servers and continuous delivery pipelines.

You may be familiar with the software engineering career model created by Meilir Page-Jones, available from Wayland Informatics. We’ve found that people at the Apprentice, Practitioner, and Journeyman levels tend to be most compatible with the kinds of work we do for clients. A Master who has a personal love for hands-on work could be a good fit, too. But we aren’t a research firm, so we wouldn’t have much work for a Researcher. Nor are we a low-end staff augmentation firm, so we wouldn’t have much work for someone who hasn’t achieved Apprentice-level proficiency in at least a couple of key skill areas and can’t demonstrate deliberate forward movement in their career growth. We carve out the middle of the Page-Jones model and focus on that.

Combining those two factors gives a grid: the five skill areas on one axis and the three proficiency levels on the other.

What we aim to do is to offer at least a couple of relevant hands-on exercises in each of the cells of that grid. As a candidate, you can then select the ones you feel will best represent your skills and allow you to demonstrate your approach to each category of work. Eventually this material will be online, but for the moment we’ll have to discuss it in a phone screen and choose appropriate exercises.

A well-rounded developer has at least some skill in all those areas. Even so, different individuals have different strengths, interests, and backgrounds; and each skill area is a demanding discipline in its own right. It would be unreasonable to expect anyone to be a Journeyman or Master across the board. The reason to have multiple exercises available at multiple levels is so that you can show your core competencies adequately while also demonstrating different levels of proficiency in other areas of software development and delivery.

For instance, you might be very strong in programming, pretty good with database work, and just “okay” at testing and DevOps. There’s nothing wrong with that. The trick is to choose exercises that allow you to show your skills in the best possible light.

You might surmise from this description that a person who has skills in just a single area might not be as interesting to us as a person who has skills in three or four areas, even if the specialist has extremely strong skills in one particular discipline. This is because a consultant has to be able to work with a wide range of clients who have different needs, with the bulk of the demand around application delivery and support. It isn’t a critique of specialists as such.

What does a technical consultant do?

Our technical consultants generally do four things, and they may have to shift from one to another many times in the course of a day:

  • Delivery – doing the work alongside client personnel. In this mode, the consultant models the behaviors we recommend our client teams adopt. This means “walking the walk” with respect to the recommended technical practices in a way that provides a living example of “doing the right thing.” It means a disciplined application of technical practices, not a loose approximation. This will be expected whether you’re embedded with a client team at their location, or working in our studio with or without client personnel present.
  • Coaching/mentoring (different people interpret those words differently) – showing/demonstrating/teaching technical practices in the context of real work. This is neither abstract teaching (as in a classroom) nor direct contribution to deliverables, but rather guiding people to shape their understanding, adoption, and use of key technical practices.
  • Consulting – providing the technical voice in the consulting advice we offer clients. In this mode, the consultant may offer suggestions for reference architectures, software products, team composition and structure, and other “technical” considerations to help clients achieve their goals.
  • Supplementing the management coaches – a technical consultant needs to be able to do the same things as the non-technical consultants do for clients. That means learning and understanding deeply the LeadingAgile organizational transformation model and being able to teach it and guide client personnel in applying it. (This is not a skill you will be able to demonstrate through the technical exercises; it’s mentioned here for context.)

Bonus points for:

  • Writing – contributing to the LeadingAgile blog (Field Notes), your own blog, or other publications
  • Speaking – appearing at user group meetings, facilitating code dojos or hackathons or other events, speaking at conferences
  • Having other interests in life – LeadingAgile people are active in charitable organizations such as Habitat for Humanity, and everyone has hobbies or deep personal interests that help to round out their lives. This is valuable for developing a balanced perspective on life, and on a practical level as a stress-relief valve; the lifestyle of a traveling consultant is inherently stressful. A one-trick pony who slings code 24×7 would probably burn out quickly. That would be unfair to both of us.

Misteaks wee hav sean

I mentioned that about nine out of ten candidates seem to misunderstand what we’re looking for with these hands-on exercises. Here are the most common mistakes, as best we can understand them. Fortunately, they’re all pretty easy to avoid.

  • Neglecting context
  • Exaggerating
  • Ignoring, or simply overlooking, the instructions

Remember that the context here is a job interview. Sometimes, when the candidate is someone we know–maybe they worked with one of us in the past, or they’re a “known quantity” in software development circles–they assume they don’t have anything to prove. The purpose of the exercises isn’t to “prove” anything, but rather to enable you to demonstrate how you approach problems and to give us a sense of how you are likely to function in the four modes mentioned above (well, the first three, anyway). The person at LeadingAgile who happens to know you will not be the only one assessing your exercises. Use the opportunity to show us how you would guide novice practitioners in applying technical practices. Don’t treat it as if you and a buddy were playing with software over a few beers.

Be careful about overselling yourself. We understand that when you’re trying to get someone’s attention at a conventional company, you have to scream louder than the next person in line. We’re not a conventional company. Some candidates have gone over the top in describing themselves, and then they were overwhelmed by the hands-on exercises we asked them to do. Had they been able to do what they claimed they could do, the exercises would have been fairly easy for them. You don’t have to be an expert at everything. After all, no one is. Just say what you can do and then do it with a high degree of professionalism.

Read the instructions for each exercise and provide the things that are requested. You’ll notice the instructions are more detailed for Apprentice-level exercises than for Journeyman-level exercises. That’s intentional, as a more-senior person ought to be able to fill in a few blanks. But the instructions are very clear about a couple of things.

First, the sequence of commits you make to version control will be examined as a trail of breadcrumbs to help us understand how you approach the work. Most candidates make only two commits: an “initial commit” when they create the repository, and a single commit containing their completed solution. That doesn’t show us what we’re interested in. We already know what a completed FizzBuzz solution looks like. We’re interested in the specific steps you take as you drive an emergent design through microtests.

A few candidates showed us their trail of breadcrumbs by exchanging emails. Not to be unfriendly, but we don’t have time to respond to 10+ emails that basically say, “I’ve added two more microtests.” You wouldn’t use email for that level of communication normally. It would be better if you showed us that you know how to bundle your changes into reasonable commits and use commit comments to communicate the sequence of steps in evolving the design.
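As a sketch of what we mean by a trail of breadcrumbs: the script below builds a hypothetical commit history for a FizzBuzz-style exercise. The commit messages and step granularity are illustrative only (your actual commits would contain the test and production code for each step), but the shape is the point: one small, coherent step per commit, narrated in the commit message.

```shell
# Hypothetical commit trail for a microtest-driven FizzBuzz exercise.
# Messages and granularity are illustrative; real commits would carry code.
set -e
workdir=$(mktemp -d)
cd "$workdir"
git init -q
git config user.email "candidate@example.com"
git config user.name "Candidate"

# Each commit bundles one microtest plus the code that makes it pass.
# --allow-empty keeps this sketch self-contained; real commits have content.
git commit --allow-empty -q -m "Initial commit: project skeleton and test runner config"
git commit --allow-empty -q -m "Microtest: non-multiples return the number itself; simplest passing code"
git commit --allow-empty -q -m "Microtest: multiples of 3 return Fizz"
git commit --allow-empty -q -m "Microtest: multiples of 5 return Buzz"
git commit --allow-empty -q -m "Microtest: multiples of 15 return FizzBuzz"
git commit --allow-empty -q -m "Refactor: extract rule table, all tests still green"

# Read oldest-first, the log tells the story of the emerging design.
git log --oneline --reverse
```

A reviewer reading that log oldest-first can follow the design decisions without a single email being exchanged.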

Second, we ask you to include a text file in your project containing your thoughts about the approach to the task and, if it’s a programming exercise, the design of the solution. Most candidates neglect this.

There is a third point as well. We expect (though the instructions don’t say so explicitly) that you will deliver a well-packaged solution. One candidate complained about this after receiving feedback on his project, saying he thought he was providing results for fellow developers to look at, not for end users. Let’s be clear about a couple of underlying ideas here:

  • Whatever a person normally does over and over again is what that person will do when the pressure is on. If you don’t package your solutions properly as a matter of habit, it’s very possible you won’t do it when you’re delivering real work for a client. At least, that’s how it appears when we are examining your exercise.
  • There’s a longstanding pearl of wisdom in software design that has been expressed in various ways. It has been called the “principle of least astonishment,” the “rule of least surprise,” and many other similar formulations. Even as a “fellow developer,” I expect your solution to follow conventions. If you submit a Node solution, I expect to be able to run npm install and npm test. If you submit a Java/Maven solution, I expect to be able to run mvn test. If you submit a Ruby solution, I expect to be able to run bundle install and bundle exec rake. And so on. Every developer community has its particular conventions. Either follow convention or document the steps necessary to run your solution. If you just sling some code and walk away, you’re giving us no reason to expect you to do anything differently on a consulting engagement. We treat our clients better than that. We ask that you treat us better than that, too.
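To make that concrete, here is a minimal sketch of what a reviewer does when a submission lands: look for the conventional build file and run the conventional command. The detection function here is just an illustration (the file names and commands are the standard conventions for each ecosystem), not a tool we actually run.

```shell
# Sketch: how a reviewer expects to run a submission, keyed off the
# conventional build file for each ecosystem. Illustrative only.
run_solution() {
  if [ -f package.json ]; then
    npm install && npm test              # Node convention
  elif [ -f pom.xml ]; then
    mvn test                             # Java/Maven convention
  elif [ -f Gemfile ]; then
    bundle install && bundle exec rake   # Ruby convention
  else
    # No recognized convention: fall back to the submitter's documentation.
    echo "No recognized build file; see README for run steps"
    return 1
  fi
}

cd "$(mktemp -d)"    # an empty directory demonstrates the fallback path
run_solution || true
```

If none of the branches apply to your project, the last one is your job: a README that says exactly how to build and run what you submitted.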

Bottom line

The hands-on exercises aren’t intended to be cute, clever, overwhelming, a school coding project, a time-waster, or anything of the kind. They’re meant to help us understand how you generally approach the work. Just remember the context, be straightforward, and pay attention to the instructions. We want the right people and you want the right job. Maybe that’s you and us. Maybe not. Let the chips fall.
