
CEO Mati Staniszewski says everyone will vibe code. Non-technical teams need an embedded engineer first. The move signals a shift in AI-native org design.
At a Sequoia Capital event, ElevenLabs CEO Mati Staniszewski laid out a staffing principle that could reorder how AI-native companies build product: embed an engineer in every non-technical team. The reason, he said, is that “everybody will be vibe coding.” Before that can happen, someone needs to show them how. The statement signals a practical shift from treating AI coding tools as engineer-only territory to making them a utility for marketing, sales, and operations staff.
The voice AI startup, known for its realistic speech synthesis, is now applying the same speed-to-product thinking to its internal workflow. By placing an engineer inside each non-engineering function, ElevenLabs aims to let domain experts use natural-language coding tools to build internal tools, automate workflows, and iterate on product features without waiting for a centralized engineering queue.
The term vibe coding describes using large language models to generate code from plain-English instructions, lowering the barrier to software creation. Staniszewski’s assertion that everyone will do it implies the technology is mature enough for broad adoption. The missing piece is not the AI model; it is the organizational scaffolding. An embedded engineer provides the initial setup, prompt engineering guardrails, and quality checks that turn a powerful yet unpredictable tool into a reliable business lever.
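The "quality checks" described above can be sketched in code. The example below is a hypothetical illustration, not ElevenLabs' actual setup: a minimal guardrail an embedded engineer might put in front of LLM-generated Python before a non-technical teammate runs it. All names (the function, the banned-call list) are assumptions for the sake of the sketch.

```python
import ast

# Hypothetical deny-list an embedded engineer might maintain.
BANNED_NAMES = {"eval", "exec", "__import__"}

def check_generated_code(source: str) -> list[str]:
    """Return a list of problems found in LLM-generated Python source.

    An empty list means the snippet parsed cleanly and used no
    banned calls; it does NOT prove the code is safe or correct.
    """
    try:
        tree = ast.parse(source)
    except SyntaxError as exc:
        return [f"syntax error: {exc.msg} (line {exc.lineno})"]

    problems = []
    for node in ast.walk(tree):
        # Flag direct use of dangerous built-ins.
        if isinstance(node, ast.Name) and node.id in BANNED_NAMES:
            problems.append(f"banned name used: {node.id}")
        # Flag imports for manual review rather than blocking outright.
        if isinstance(node, (ast.Import, ast.ImportFrom)):
            problems.append("import statement found; review manually")
    return problems

# Example: a snippet a marketer might get back from a coding assistant.
snippet = "total = sum(row['clicks'] for row in rows)"
print(check_generated_code(snippet))  # → []
```

In this pattern, the engineer's role is to own the checker and the deny-list while domain experts own the prompts, which is one concrete way an "embedded engineer" turns an unpredictable tool into a repeatable workflow.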
This model, if it spreads, could accelerate demand for the infrastructure that powers AI coding assistants. Microsoft’s GitHub Copilot, already used by over 1.8 million developers, would see expanded usage if non-technical teams begin adopting similar tools. The compute required to run these models at scale flows back to GPU providers, a dynamic that reinforces NVIDIA’s position as the primary hardware supplier for AI workloads. The embedded-engineer approach also points to rising demand for AI orchestration tools that let non-engineers interact with models safely, a market that includes startups and cloud providers.
ElevenLabs’ move is a microcosm of a larger trend: AI-native startups are reorganizing around AI fluency rather than traditional engineering hierarchies. If the embedded-engineer model proves effective, it could become a template for enterprise adoption of generative AI coding tools. Public companies that sell AI development platforms, including Microsoft, GitLab, and others, stand to benefit from a broader user base that extends beyond professional developers. The shift also highlights that AI still requires human oversight, which could increase demand for AI engineering talent and for platforms that simplify AI integration for non-technical users.
For investors tracking the AI infrastructure buildout, ElevenLabs’ staffing decision is a real-world signal that the next phase of AI productivity will depend on organizational design, not just model capability. The startup’s product velocity over the next few quarters will serve as a case study. Watch for similar announcements from other venture-backed AI firms and for enterprise pilot programs that embed engineers in business units. The success or failure of this model will help determine whether vibe coding becomes a mainstream productivity driver or remains a niche experiment.
Drafted by the AlphaScala research model and grounded in primary market data: live prices, fundamentals, SEC filings, hedge-fund holdings, and insider activity. Each story is checked against AlphaScala publishing rules before release. Educational coverage, not personalized advice.