EXPERIENCE A WHOLE NEW WAY
TO SURFACE TALENT!

About us

Staffing can make or break a company, but searching for talent can feel like a guessing game. Sometimes your best candidates are hidden in plain sight.
Restless Bandit keeps companies from making costly mistakes and wasting time on the wrong people. Our tools use existing data on employees and new applicants to find strong matches for every opening. The best choice isn’t always the obvious one, and our hiring technologies will help you see the difference.
We deliver solutions that are rigorous and practical for our clients. Our data scientists and engineers have deep experience in this field and a proven track record. Your company's success rests on your people, so don't settle: get restless. Decisions this important shouldn’t be left to chance.

Why Restless Bandit?

A prospector searching for gold, a director of a team of pharmaceutical researchers, and a hiring manager considering candidates for a position all face a similar problem: given the outcome of past decisions, what is the best choice at the next step? They must balance the tradeoff between “earning” by choosing known good options, and “learning” by exploring the new and unknown.

Sequential dynamic allocation problems like these are called “multi-armed bandit problems”, after a gambler facing a row of slot machines (sometimes called “one-armed bandits”) who must decide, with each coin, which lever to pull. The gambler would obviously like to play only the machine with the highest payoff, but he can learn a machine’s payoff only by playing it. What strategy maximizes the total payoff over the whole sequence of plays?
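For the technically curious, here is a minimal sketch of one classic strategy for this problem, epsilon-greedy: mostly pull the machine with the best average payoff seen so far, but occasionally try a random one. The win rates and the value of epsilon below are made-up numbers for illustration, not a description of our product.

```python
import random

def epsilon_greedy(true_payoffs, n_plays=1000, epsilon=0.1):
    """Play a row of slot machines with the epsilon-greedy strategy.

    With probability epsilon we "learn" by trying a random machine;
    otherwise we "earn" by pulling the machine with the best average so far.
    """
    n_machines = len(true_payoffs)
    pulls = [0] * n_machines        # times each machine has been played
    estimates = [0.0] * n_machines  # running average payoff per machine
    total = 0.0

    for _ in range(n_plays):
        if random.random() < epsilon:
            arm = random.randrange(n_machines)  # explore
        else:
            arm = max(range(n_machines), key=lambda i: estimates[i])  # exploit
        reward = 1.0 if random.random() < true_payoffs[arm] else 0.0
        pulls[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / pulls[arm]
        total += reward
    return total, estimates

# Three machines with hidden win rates; only play reveals which one is best.
print(epsilon_greedy([0.2, 0.5, 0.7]))
```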

Most approaches to this problem assume that the metaphorical slot machines are static: whether they are played now or later, they yield the same information. But the world is seldom so constant. People are always gaining skills, experiences, and interests, and jobs are constantly evolving with business and technological changes. Multi-armed bandit problems whose arms change over time like this are called restless bandit problems.
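One standard textbook adjustment for such drifting payoffs is to weight recent observations more heavily than old ones. The sketch below (again with made-up drift and parameter values, purely as an illustration of the general idea) replaces the running average with an exponential recency-weighted average so that stale evidence fades.

```python
import random

def restless_epsilon_greedy(n_plays=5000, n_machines=3, epsilon=0.1, step=0.1):
    """Epsilon-greedy with a constant step size for drifting payoffs.

    A constant step size yields an exponential recency-weighted average,
    so stale observations fade and the estimates can track machines whose
    payoffs change between plays.
    """
    payoffs = [random.random() for _ in range(n_machines)]  # hidden win rates
    estimates = [0.0] * n_machines
    total = 0.0

    for _ in range(n_plays):
        # The world changes under us: each payoff takes a small random walk.
        payoffs = [min(1.0, max(0.0, p + random.gauss(0.0, 0.01)))
                   for p in payoffs]

        if random.random() < epsilon:
            arm = random.randrange(n_machines)
        else:
            arm = max(range(n_machines), key=lambda i: estimates[i])
        reward = 1.0 if random.random() < payoffs[arm] else 0.0
        estimates[arm] += step * (reward - estimates[arm])  # recent pulls weigh more
        total += reward
    return total / n_plays  # average payoff per play

print(restless_epsilon_greedy())
```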

WORK WITH US

We want to ease the pains and stresses caused by inefficiencies in the hiring process, and we're looking for engineers, product managers, and otherwise smart and creative people to help us. If you want to get in on the ground floor and make an impact on one of the world's most important industries while working on a challenging and fascinating problem, come work with us.

[email protected]

CONTACT US
