Two-phase prize design

We are considering splitting the prize into two phases.

Phase 1: Qualification Round

Teams submit proposals to rapidly reskill workers in one month (or less) for one chosen occupation.

The occupation must be:

  • Growing faster than the overall occupational average, as measured by the U.S. Bureau of Labor Statistics;
  • Unlikely to be automated in the next ten years;
  • Medium- or high-skilled, as measured by the National Compensation Survey;
  • Paying a living wage, as measured by MIT’s Living Wage Calculator; and
  • A stepping stone to career advancement.

Requirements for teams:

  • Team must place at least 1,000 individuals in the appropriate job.
  • The planned training time must be at least 50% shorter than the currently established training time (to be defined once specific occupations are chosen).

From launch, teams would have one month to submit their detailed proposals to XPRIZE for approval. Judges would review each proposal and select the 10 most promising: those with the highest likelihood of success. The finalists would receive $X each to carry out their plans.

Phase 2: Training and Job Placement

Finalists have one month to recruit 1,000 workers to participate in their training programs.

Finalists would then have 100 days to train those 1,000+ individuals, who must find work in the relevant occupation and retain it for at least 100 days. Entry costs for participants would need to be zero or near zero.

The winning team is the one that places the most individuals in jobs and keeps them employed for the longest period of time.

Additional judging criteria:

  • Teams that train for higher-skilled, higher-wage occupations would be ranked higher.
  • Teams that create scalable education tools which can be redeployed to rapidly train for other occupations would be ranked higher.
  • Do you think the 1,000-trainee goal is audacious enough, too audacious, or just right?
  • Should competing teams be able to screen out candidates for their programs based on parameters such as educational attainment and income?
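The ranking criteria above could be expressed as a scoring function. The sketch below is a minimal illustration only: the weights, wage threshold, and field names are assumptions for discussion, not part of the prize rules.

```python
# A minimal scoring sketch for the judging criteria above. All weights,
# thresholds, and field names are illustrative assumptions, not prize rules.

def score_team(placements: int, avg_retention_days: float,
               avg_hourly_wage: float, reusable_tools: bool) -> float:
    """Rank teams by placements weighted by retention, with bonuses for
    higher-wage occupations and scalable, redeployable training tools."""
    # Retention is capped at the 100-day goal so extra days don't dominate.
    base = placements * min(avg_retention_days, 100) / 100
    wage_bonus = 1.1 if avg_hourly_wage >= 20 else 1.0  # assumed wage threshold
    tool_bonus = 1.1 if reusable_tools else 1.0
    return base * wage_bonus * tool_bonus

# Under these assumed weights, a smaller cohort placed in a higher-wage
# occupation with reusable tools can outrank a larger cohort.
print(score_team(1000, 100, 18, False))  # 1000.0
print(score_team(900, 100, 25, True))
```

How the bonuses trade off against raw placement counts is exactly the kind of decision the judging panel would need to make explicit.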

I think 1,000 seems reasonable. I also don’t think teams should select based on educational attainment or income; the program should be inclusive of anyone interested in workforce training, regardless of background.

As someone who has built assessments and course curricula in the past, I can say that 1,000 trainees is definitely an audacious goal. I’m not against the criterion; this is the time to go for it. But if the curriculum is faulty, we will have wasted the time of 1,000 people who desperately need work.

The second question is trickier. Income definitely shouldn’t be considered. However, if we are retraining workers from one sector for another, we may want to understand their existing skills and abilities to increase their chances of success. For example, if the curriculum is in English, we may want to make sure the participants can read and write English.

With regard to the first requirement (training and placing 1,000+ individuals), I suggest we divide it into phases (e.g., start with 100 individuals to prove that the curriculum works, then move to 500 and then 1,000).

Again, I’m looking at this purely from the perspective of trainees who desperately need work. It would be a catastrophic failure to waste the time of 1,000 trainees on a faulty curriculum when you could have discovered the fault by testing it with just 100. Failing is still terrible, but more manageable with a smaller group.

Question for @Roey

How are we going to quantify the requirement above? Are we going by the hourly wage? I found this list on the NCS. According to it, the 50th-percentile hourly wage is $15.85. Are we saying that any occupation with an hourly wage greater than that is one we can consider?
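If that interpretation holds, it would amount to a simple wage filter. A sketch, in which the occupation names and wages are placeholders (only the $15.85 figure comes from the discussion above):

```python
# Sketch of the wage-threshold interpretation discussed above: keep any
# occupation whose hourly wage exceeds the NCS 50th-percentile figure.
# The occupations and wages below are placeholders, not official data.

NCS_MEDIAN_HOURLY = 15.85  # 50th-percentile hourly wage cited above

occupations = {
    "hypothetical technician role": 27.50,
    "hypothetical retail role": 14.00,
}

qualifying = [name for name, wage in occupations.items()
              if wage > NCS_MEDIAN_HOURLY]
print(qualifying)  # only the role above the threshold remains
```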

I interpret the 1,000-worker target as a local goal; the best-case scenario is creating a replicable program.

@feskafi - Thank you! This is a big area of focus for us because, while we want to be audacious in our goal, we DEFINITELY want successful outcomes. We’re challenged to find a good baseline for this number. For instance, in-person training programs supported by workforce development boards tend to be in the range of 15-20 participants. We’ve heard of other hybrid training programs that have about 100 students. We assume our competitors/training teams can reach many more participants than this since the instruction is available online.

Do you have any suggestions on how we might benchmark standard online training cohort sizes?

@lancemcneill and @feskafi - Would love your help thinking through this…The reason for the income and educational restrictions is that we want to ensure that vulnerable populations are the target beneficiaries of this training and placement program.

Now, it could be argued that we are ALL vulnerable to job loss and economic challenges, especially now in light of the situation. However, the goal of this particular prize is to serve people who have traditionally not been able to find work that pays a living wage and people who do not have a secondary education (i.e., lower-skilled, less educated, and lower-earning individuals).

So, even though these two requirements are limiters, they also ensure that the right population is being served through the prize.

With that perspective, do you still think we should have no filters for our cohort criteria?

Yes, I still think there shouldn’t be income or educational restrictions. The more people who are working and prepared for well-paying jobs the better. Perhaps you target certain populations with your marketing and outreach, but I wouldn’t turn anyone away because they had a high-paying job prior to being let go or they’ve already got a Bachelor’s degree.
The perception that automation and technology are displacing jobs can also be countered with a more inclusive program.

Also, by keeping it broad and less restrictive, you can learn about the program’s impact on different levels of income, education, etc. Then you can refine the program in later iterations if it makes sense to do so.

I haven’t studied the academic literature on the efficacy and inclusiveness/exclusiveness of different job training programs. Perhaps, you could spend some time doing a literature review to see if there is some evidence to support your decision either way.

I agree with @lancemcneill and the rest that income shouldn’t be a criterion in any way, shape, or form.

I’m conflicted on the education criterion though. If we want to place the candidates in jobs that pay (let’s say) $20 or more per hour, some minimum level of education is needed. Usually that minimum level is a high school diploma/GED or vocational school.

If the purpose of this award is simply to place candidates in a job, let’s agree that the job could be a minimum-wage job. If the purpose is to place them in higher-paying jobs, we should allow teams to check for a minimum education level (e.g., a GED or vocational training).

Another criterion we should discuss - do we want to allow candidate filtering based on background check/drug test?

Indeed, it’s something we’re also debating. It raises questions around privacy and time. If we want to retrain and up-skill workers rapidly, I’m not sure it’s feasible.

That’s aside from the question whether it’s even desirable.

I’m curious to know why the goal starts at 1,000 people. I would imagine gradually more challenging phases might make more sense, for example: Phase 1, 100 people (testing phase); Phase 2, 1,000 people (beta phase); Phase 3, 10,000 people (final phase). Each phase could carry different criteria for success and different prize/reward levels.

Also, one aspect I thought could be looked at under a microscope is the word “skill”. There is a traditional, legacy understanding of what a “skill” is, and I think that understanding is part of what got us into this mess in the first place. There are also newer, more forward-thinking ideas of what a “skill” is. Humanity has been through various stages: hunter-gatherer, farmer, factory worker, corporate/military worker, and now… something else? Each stage required different and new skills, and I think we are facing the start of a new anthropological stage for humanity that begs for a reboot of our idea of relevant skills. My thinking in this regard aligns well with what I’ve read from Seth Godin, Simon Sinek, Brené Brown, Benjamin Zander, and Esther Perel.

@AdrianFC Thank you for the feedback. We based our cohort size on the goal of being audacious yet achievable. These criteria were formed through a mixture of research (we looked at success rates for online courses, boot camps, MOOCs, etc.) and in-depth interviews with educational and operational experts. Moreover, we’ve recently lowered this threshold to 500 workers.

I do like your idea of a phased competition, it certainly is audacious. Do you think that individual teams would have the capabilities to train 10,000 people within the 30-day timeframe? What do you imagine the attrition rate would be? We’d really love to hear your insights - they’d be very useful in our design work.

Hi Jordan, no, I don’t think training 10,000 people for a new line of work in 30 days is achievable. However, I would be interested in a phased aspect: for example, the extended goal could be to train 500 people in 30 days and have those 500 people train 5,000 more. If each of the 500 commits to training and mentoring 10 new recruits, it could be very scalable and allow for exponential growth.

One industry I think could be fascinating is childcare and homeschooling. A friend of mine has an interesting startup called Helpr, a childcare matchmaking service, and I’m watching closely to see how they pivot into a remote-oriented solution. There are so many possibilities here. I can imagine stay-at-home moms in South Africa (where I am from) being paid by the hour for Zoom-based childcare, tutoring, and homeschooling. There is big demand there, and potentially limitless supply of remote childminders and mentors around the world who could adapt to the role very quickly. Then it becomes a marketing challenge to build enough awareness and excitement that a match between supply and demand can be sustained.
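The train-the-trainer arithmetic above compounds quickly. A toy model, assuming (purely for illustration) that every newly trained person goes on to mentor 10 recruits per round:

```python
# Toy model of the train-the-trainer scaling idea above: a seed cohort of
# 500, where each newly trained person mentors 10 new recruits per round.
# The seed size, mentoring ratio, and round count are assumptions.

def trained_after(rounds: int, seed: int = 500, ratio: int = 10) -> int:
    """Total people trained after the given number of mentoring rounds."""
    total = seed
    newly_trained = seed
    for _ in range(rounds):
        newly_trained *= ratio  # each new trainee mentors `ratio` recruits
        total += newly_trained
    return total

print(trained_after(1))  # 500 seed + 5,000 recruits = 5500
```

One round already reaches the 5,000-person figure mentioned above; the open question is whether quality holds up as trainees become trainers.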

@AdrianFC: I would also suggest bringing in elements of “how to manage change” and “how to continuously learn”, beyond conventional technical training, so that students get access to tools that might become relevant at multiple points in their lives. It harks back to the adage: “Teach a person one skill and they can find a job; teach a person how to learn and they can find a career.” In the spirit of the audacity that this group aspires to, I suggest we set some loftier goals than purely skill building and job finding. These can be milestones on the journey but cannot be the destination. I worry that if we do not set our sights ambitiously, we might fail in our responsibility and miss the opportunity to make a real, meaningful difference as part of this collective. [This comment is repeated from a different thread with @NickOttens; I hope it was OK to repeat it here, as I thought it was relevant to the discussion. Apologies for cross-posting.]

@AdrianFC Thank you for elaborating - I think the phased approach is worth discussing with the team. And I applaud the work your friend is doing at Helpr — this is an important line of work, and given the current health crisis, it’s needed more than ever! Do you think Helpr would be interested in entering this XPRIZE competition?

@ukarvind Thank you for your feedback. So, in addition to technical skills, teams should include training in the “soft” skills, such as self-teaching/grit/resilience. Did I get that right? If so, I love that idea. How would you recommend we measure something like this when it comes to judging the teams’ performance?