I've been traveling frequently across Asia, meeting the people building the infrastructure behind the AI boom. Most conversations center on models, benchmarks, and latency. My conversation with Yongle Yang in Tokyo was about something different - and arguably more important: who actually gets to build AI applications, and what's still standing in their way.
Yang is a Solution Architect at Dify, and the answer he's working toward is straightforward: everyone should. That conviction is reshaping how thousands of developers and non-developers alike are shipping AI products right now.
Making AI Buildable - The Philosophy Behind the Platform
Yang doesn’t describe Dify as a tool - he describes it as a way to build. The name says it clearly - Do It For Yourself - rooted in their CEO’s belief that you only understand something when you get your hands dirty. That philosophy isn’t branding. It’s the design principle behind every feature of the platform.
Dify is an LLMOps platform that brings together every component required to build a production-grade AI application - prompts, knowledge bases, context, code logic, API integrations, and monitoring - into one place. But this isn’t just about convenience.
The biggest barrier to AI adoption inside most companies today isn’t talent or budget. It’s technical onboarding. “Companies are really having a hard time doing technical onboarding for any kind of AI usage,” Yang said. His response is simple: “I can give anyone a 10-minute lesson on what it takes to build an AI chatbot with Dify.” That’s not a pitch - it’s a pattern he’s seen repeatedly across clients throughout the region.
If Building AI Feels Complex, You’re Looking at the Wrong Layer
At the core of Dify is a workflow canvas - a visual interface where developers can build sophisticated AI pipelines by connecting nodes. Each node handles a specific function: HTTP requests, Python or JavaScript execution, LLM interactions, and knowledge base queries. No deep infrastructure knowledge required.
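To make the node-based model concrete, here is a minimal Python sketch of the idea - each node does one job, and the workflow chains them together. This is an illustration only: the `Workflow` class and node functions are hypothetical stand-ins, not Dify's actual API.

```python
# Hypothetical sketch of a node-based workflow: each node is a
# single-purpose function, and the workflow runs them in sequence,
# passing each node's output to the next. Not Dify's real SDK.
from typing import Any, Callable

class Workflow:
    def __init__(self) -> None:
        self.nodes: list[tuple[str, Callable[[Any], Any]]] = []

    def add_node(self, name: str, fn: Callable[[Any], Any]) -> "Workflow":
        self.nodes.append((name, fn))
        return self  # allow chaining, like wiring nodes on a canvas

    def run(self, payload: Any) -> Any:
        for name, fn in self.nodes:
            payload = fn(payload)  # each node consumes the previous output
        return payload

# Stand-ins for the node types mentioned above.
def http_request(query: str) -> dict:
    return {"query": query, "docs": ["doc-1", "doc-2"]}   # pretend fetch

def knowledge_lookup(ctx: dict) -> dict:
    ctx["context"] = " ".join(ctx["docs"])                # pretend retrieval
    return ctx

def llm_call(ctx: dict) -> str:
    return f"Answer to '{ctx['query']}' using {ctx['context']}"  # pretend LLM

wf = (Workflow()
      .add_node("http", http_request)
      .add_node("kb", knowledge_lookup)
      .add_node("llm", llm_call))

print(wf.run("What is Dify?"))
```

The point of the sketch is the shape, not the internals: the developer wires nodes together and never touches the infrastructure behind each one.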
So what does a developer actually need to think about?
"They just need to worry about the whole picture of their project," Yang said. The components handle themselves. And for teams that want to go further, Dify's Marketplace allows secure, reliable API integrations with any third-party tool - from Microsoft and Google to platforms like Agora’s Conversational AI Engine, which Yang tested himself during the week of our conversation.
His feedback was direct: "Compared to building TTS and STT interaction natively inside Dify, the Agora integration is just on another level. I built a real-time chatbot with private knowledge in five minutes, and it works on mobile, on desktop - everything is fast and accurate."
Single Agent or Multi-Agent: The Architecture Question
One of the most practical parts of our conversation was Yang’s take on agent design - a question every developer building with LLMs eventually runs into. His answer was clear: he’s a multi-agent believer.
Why not just make one smart agent and call it done?
"If you want an office assistant, you're going to need a multi-agent system," he said. The architecture he favors uses a master agent that interprets user intent and distributes tasks to specialized sub-agents - one for email drafting, one for scheduling, one for data retrieval. Each sub-agent is purpose-built and reliable at its specific function. The master agent handles orchestration. Dify handles the synchronization automatically, driven by the LLM's own judgment.
The insight here is simple: single-agent and multi-agent systems aren’t competing ideas. Multi-agent systems are just single-agent thinking, scaled and coordinated. The complexity doesn’t disappear - it’s abstracted away. The developer just defines the workflow.
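The master/sub-agent pattern Yang describes can be sketched in a few lines of Python. Everything here is hypothetical: in Dify the routing decision would come from the LLM's own judgment, so a simple keyword heuristic stands in for it below.

```python
# Hypothetical sketch of the master/sub-agent pattern: a master agent
# interprets intent and delegates to a purpose-built sub-agent.
# The keyword check is a stand-in for LLM-driven intent classification.

def email_agent(task: str) -> str:
    return f"[draft email] {task}"

def scheduling_agent(task: str) -> str:
    return f"[calendar entry] {task}"

def retrieval_agent(task: str) -> str:
    return f"[data lookup] {task}"

SUB_AGENTS = {
    "email": email_agent,
    "schedule": scheduling_agent,
    "retrieve": retrieval_agent,
}

def master_agent(request: str) -> str:
    # Stand-in for the LLM deciding which specialist handles the task.
    text = request.lower()
    if "email" in text or "write" in text:
        intent = "email"
    elif "meeting" in text or "schedule" in text:
        intent = "schedule"
    else:
        intent = "retrieve"
    return SUB_AGENTS[intent](request)

print(master_agent("Schedule a meeting with the Tokyo team"))
```

Each sub-agent stays small and reliable at one thing; the master agent only orchestrates - which is exactly the abstraction the paragraph above describes.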
From Prompt to Product - Just Describe It
The most forward-looking part of the conversation focused on where Dify is headed next. Yang described a feature currently in development: a workflow prompt IDE in which a developer types a use case in natural language - specifying the input and the desired output - and a fully configured workflow canvas is generated automatically. "You just type your use case, and everything integrates together. A fully ready workflow."
The timeline: two to three months. And if it ships as described, it changes the conversation entirely - from "how do I build this" to "what should I build next."
Two other upcoming features worth noting: Webhook and Trigger support, enabling workflows to activate automatically from external events, and continued expansion of the Marketplace with more third-party integrations baked in from day one.
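Since Webhook and Trigger support hasn't shipped yet, here is only a rough sketch of the event-driven shape it implies - workflows registering for named events and running automatically when an external service posts. All names here are invented for illustration, not Dify's upcoming API.

```python
# Hypothetical sketch of webhook/trigger-style activation: workflows
# register for a named event, and an incoming webhook fires every
# workflow registered for that event. Purely illustrative.
from typing import Callable

_triggers: dict[str, list[Callable[[dict], str]]] = {}

def on_event(name: str):
    # Decorator that registers a workflow to run when `name` fires.
    def register(workflow: Callable[[dict], str]):
        _triggers.setdefault(name, []).append(workflow)
        return workflow
    return register

def webhook_received(name: str, payload: dict) -> list[str]:
    # Entry point an HTTP endpoint would call on an external event.
    return [wf(payload) for wf in _triggers.get(name, [])]

@on_event("ticket.created")
def summarize_ticket(payload: dict) -> str:
    return f"summary of ticket {payload['id']}"

print(webhook_received("ticket.created", {"id": 42}))
```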
Build AI That Replaces You
Yang closed with a piece of advice that sounds counterintuitive - until you think about it.
"You should always try to build a project that can replace yourself."
Find the repetitive, time-consuming parts of your own workflow. Build the AI to handle them. As the models improve, they get better at your job. That frees you to move up - to more creative work, to higher-leverage problems, to things that actually require human judgment. Then automate those too.
It's a compounding loop, and it's the mindset that separates developers building ahead of this shift from those reacting to it.
There's a lot more where this came from. Watch the full conversation: https://www.youtube.com/watch?v=EoXAPh7HGCg


