TL;DR
OpenAI has asked third-party contractors to submit actual files and task descriptions from current or prior jobs so the company can measure AI agents against human performance. Contractors are told to remove personally identifiable and proprietary information, but legal and trade-secret risks remain a concern.
What happened
Records reviewed by WIRED show OpenAI recruited outside contractors to supply real-world assignments and the outputs they produced, so the company can evaluate next-generation AI agents against human work. The company’s materials ask participants to convert multi-hour or multi-day pieces of work into discrete tasks that include both the original request and the deliverable, and to upload concrete files such as Word documents, PDFs, slide decks, spreadsheets, images or code repositories. Contractors may also submit fabricated examples that mirror realistic responses. OpenAI’s instructions repeatedly advise removing personal data, proprietary content and other confidential material; one internal file references a ChatGPT feature labelled “Superstar Scrubbing” that guides contributors through redaction. OpenAI and the training-data firm Handshake AI declined to comment. Legal experts quoted in the reporting warned that relying on contractors to identify and remove secrets could leave AI labs exposed to trade-secret claims and place the contractors themselves at risk of breaching prior nondisclosure obligations.
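The “scrubbing” contractors are asked to perform is essentially redaction of identifiers before upload. As a minimal illustration only (this is not OpenAI’s “Superstar Scrubbing” tool, and real confidentiality review needs far broader coverage), a pattern-based PII scrubber might look like:

```python
import re

# Hypothetical patterns for illustration; a real scrubbing pass would also
# need to handle names, addresses, client identifiers, account numbers, etc.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scrub(text: str) -> str:
    """Replace each matched identifier with a labelled placeholder."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text
```

Pattern matching like this catches only formulaic identifiers; names, client details and trade secrets still require human judgment, which is precisely the gap the legal experts quoted in the reporting flag.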
Why it matters
- Using real workplace deliverables helps create human performance baselines that can be used to benchmark AI agents across occupations.
- Relying on contractors to scrub sensitive information shifts responsibility for confidentiality and may not prevent trade-secret disclosure.
- If confidential or personal data slips through, AI companies and contractors could face legal exposure and reputational harm.
- The demand for higher-quality, job-specific data has created a growing market for skilled contractors and third-party data firms.
Key facts
- OpenAI asked contractors to describe tasks they performed and to upload the actual files produced in response to task requests, not summaries.
- Accepted file types include Word docs, PDFs, PowerPoint slides, Excel sheets, images and code repositories.
- The company told contributors to remove personal information, proprietary or confidential data and material nonpublic information before uploading.
- Project documents include an example from a luxury concierge role: a two-page draft itinerary for a 7-day yacht trip to the Bahamas.
- One internal file references a ChatGPT tool called “Superstar Scrubbing” to help redact confidential information.
- OpenAI and Handshake AI declined to comment to WIRED about the program.
- An intellectual property attorney warned the approach could lead to trade-secret misappropriation claims and that contractors might breach former employers’ NDAs.
- AI labs increasingly hire contractor networks and specialist firms — named examples in the reporting include Surge, Mercor and Scale AI — to source higher-quality data.
- Handshake was reported as valued at $3.5 billion in 2022; Surge was reportedly discussing a $25 billion valuation in fundraising talks last summer.
What to watch next
- Whether any legal actions or trade-secret claims arise from files submitted under this program (not confirmed in the source).
- If OpenAI adopts additional internal review or verification steps to detect residual confidential information before use (not confirmed in the source).
- Whether employers or regulators pursue enforcement related to contractors sharing work outputs from prior jobs (not confirmed in the source).
Quick glossary
- AI agent: A software system designed to perform tasks or make decisions on behalf of a user, often by interacting with data and other tools.
- Personally identifiable information (PII): Data that can be used to identify a specific individual, such as names, contact details, social security numbers or other private identifiers.
- Trade secret misappropriation: The unauthorized acquisition, disclosure or use of a company’s confidential business information that provides a competitive advantage.
- Nondisclosure agreement (NDA): A contract that restricts sharing of specified confidential information between parties.
Reader FAQ
Did OpenAI ask for actual files from contractors’ past jobs?
Yes. OpenAI’s materials requested concrete outputs — documents, slides, spreadsheets, images or repositories — rather than summaries.
Were contractors instructed to remove personal or confidential information?
Yes. The instructions repeatedly directed contributors to remove or anonymize personal information, proprietary or confidential data and material nonpublic information.
Has OpenAI said how it will use the submitted files?
The documents indicate the files are for evaluating model performance against human baselines; broader use cases were not detailed in the source.
Did OpenAI respond to requests for comment?
OpenAI and Handshake AI both declined to comment, according to the reporting.
Could contractors or OpenAI face legal problems for sharing these files?
An intellectual property lawyer cited in the report warned there is legal risk from trade-secret misappropriation or NDA violations, though specific legal actions were not reported.

Source article: “OpenAI Is Asking Contractors to Upload Work From Past Jobs to Evaluate the Performance of AI Agents,” by Will Knight, Maxwell Zeff and Zoë Schiffer, WIRED (Business), Jan 9, 2026.
Sources
- OpenAI Is Asking Contractors to Upload Work From Past Jobs to Evaluate the Performance of AI Agents
- Building practical AI agents for real businesses
- Can AI do your job? See the results from hundreds of tests.
- New tools for building agents