<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[The Synthesis Company]]></title><description><![CDATA[The Synthesis Company]]></description><link>https://newstand.zenbase.ai</link><image><url>https://substackcdn.com/image/fetch/$s_!1Uv0!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba1a6261-96c5-44cc-a4b3-9c241eef6d61_656x656.png</url><title>The Synthesis Company</title><link>https://newstand.zenbase.ai</link></image><generator>Substack</generator><lastBuildDate>Wed, 06 May 2026 10:44:27 GMT</lastBuildDate><atom:link href="https://newstand.zenbase.ai/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Cyrus]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[synthesis@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[synthesis@substack.com]]></itunes:email><itunes:name><![CDATA[Cyrus]]></itunes:name></itunes:owner><itunes:author><![CDATA[Cyrus]]></itunes:author><googleplay:owner><![CDATA[synthesis@substack.com]]></googleplay:owner><googleplay:email><![CDATA[synthesis@substack.com]]></googleplay:email><googleplay:author><![CDATA[Cyrus]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[Let's talk about Structured Prompting]]></title><description><![CDATA[JSON, XML, and a new primitive for AI agents you've been missing out on]]></description><link>https://newstand.zenbase.ai/p/lets-talk-about-structured-prompting</link><guid isPermaLink="false">https://newstand.zenbase.ai/p/lets-talk-about-structured-prompting</guid><dc:creator><![CDATA[Cyrus]]></dc:creator><pubDate>Wed, 30 Jul 2025 01:14:11 GMT</pubDate><enclosure 
url="https://substackcdn.com/image/fetch/$s_!1Uv0!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fba1a6261-96c5-44cc-a4b3-9c241eef6d61_656x656.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h2>The Recent Buzz on X</h2><p>Over the past week, X has been ablaze with discussions on structured prompting techniques for large language models. It started with a surge in posts hyping "JSON prompting" as a revolutionary skill for 2025, promising to transform LLMs like ChatGPT, Claude, and Gemini into reliable, hallucination-free agents. Threads shared copy-paste templates for tasks like content generation and data extraction, emphasizing how JSON's key-value structure provides clarity and consistency. Influential voices like Theo predicted it would dominate conversations, and viral posts from creators like Sehaj Singh <a href="https://x.com/heysehajsingh/status/1949815825210409200">amplified the trend</a>, garnering thousands of likes and reposts.</p><p>But the discourse quickly evolved into a debate, with critics pointing out JSON's limitations&#8212;like token inefficiency and noise from whitespace&#8212;and advocating for alternatives. XML prompting emerged as a strong contender, with users claiming it's superior for creating clear boundaries in prompts. Posts from <a href="https://x.com/mattshumer_/status/1949968378275275230">Matt Shumer urged ditching JSON entirely</a>, while <a href="https://x.com/jobergum/status/1950091591470707074">Jo Kristian Bergum called XML the "cheat code"</a> used by LLM companies themselves. This shift highlights a growing consensus that structured formats aren't just hype; they're essential for scaling AI reliability, though the community remains divided on the best approach.</p><h2>JSON vs. 
XML Prompting: Why XML Wins</h2><p>OpenAI&#8217;s <a href="https://x.com/noahmacca/status/1949541371469254681">Noah MacCallum</a> set the record straight:</p><ul><li><p>I&#8217;ve actually done experiments on this and markdown or xml is better</p></li><li><p>&#8220;Models are trained on json&#8221; &#8594; yes they&#8217;re also trained on a massive amount of plain text, markdown, etc</p></li><li><p>JSON isn&#8217;t token efficient and creates tons of noise/attention load with whitespace, escaping, and keeping track of closing characters</p></li><li><p>JSON puts the model in a &#8220;I&#8217;m reading/outputting code&#8221; part of the distribution, not always what you want</p></li></ul><p>My theory is that XML prompting excels because it explicitly delimits the start and end of sections with semantic tags, creating what we might call "<strong>semantic attentional boundaries</strong>." These tags&#8212;&lt;task&gt; for objectives, &lt;context&gt; for background&#8212;act as clear fences in the model's attention mechanism, allowing it to compartmentalize information more effectively than JSON's nested braces. Anthropic's documentation on Claude even recommends XML for this reason, noting improved parsing and reduced errors.</p><h2>Introducing LLML: The Compositional Primitive for AI Contexts</h2><p><a href="https://github.com/zenbase-ai/llml">LLML</a> is a tool I built to make writing XML easy: it transforms nested data structures into optimized markup. It's available in Python, TypeScript, Rust, and Go under the MIT license. You compose plain data structures, and LLML handles converting them to XML (or JSON, if you really want to).</p><p>Here&#8217;s a quick TypeScript example:</p><pre><code><code>import { llml } from "@zenbase/llml";

const researchPrompt = llml({
  role: "Expert Researcher",
  query: "Impact of quantum computing on cryptography",
  sources: [
    "Academic papers from arXiv",
    "Recent news from TechCrunch",
    "Expert opinions from IEEE"
  ],
  steps: [
    "Retrieve and summarize key documents",
    "Identify breakthroughs and risks",
    "Synthesize recommendations"
  ],
  output_format: "Structured report with sections: Summary, Analysis, Future Implications",
  guardrails: [
    "Cite sources accurately",
    "Avoid unsubstantiated claims"
  ]
});

// Outputs VibeXML:
// &lt;role&gt;Expert Researcher&lt;/role&gt;
// &lt;query&gt;Impact of quantum computing on cryptography&lt;/query&gt;
// &lt;sources&gt;
//   &lt;sources-1&gt;Academic papers from arXiv&lt;/sources-1&gt;
//   &lt;sources-2&gt;Recent news from TechCrunch&lt;/sources-2&gt;
//   &lt;sources-3&gt;Expert opinions from IEEE&lt;/sources-3&gt;
// &lt;/sources&gt;
// ... (and so on)
</code></code></pre><p>This setup makes the agent prompt modular: swap sources or steps without reformatting everything.</p><p>And a Python one:</p><pre><code><code>from zenbase_llml import llml

agent_prompt = llml({
    "role": "Agentic RAG Analyzer",
    "initial_query": "Optimize supply chain logistics with AI",
    "retrieval": {
        "database": "Vector store of industry reports",
        "top_k": 5
    },
    "reasoning_loop": [
        "Evaluate retrieved docs for relevance",
        "Refine query if needed",
        "Generate final synthesis"
    ],
    "tools": ["search_api", "summarizer"],
    "guardrails": [
        "Limit iterations to 3",
        "Ensure cost under $0.05"
    ]
})

# Outputs VibeXML:
# &lt;role&gt;Agentic RAG Analyzer&lt;/role&gt;
# &lt;initial_query&gt;Optimize supply chain logistics with AI&lt;/initial_query&gt;
# &lt;retrieval&gt;
#   &lt;database&gt;Vector store of industry reports&lt;/database&gt;
#   &lt;top_k&gt;5&lt;/top_k&gt;
# &lt;/retrieval&gt;
# ... (etc.)
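
# The transformation can be sketched in plain Python. This is a
# simplified illustration of the idea, NOT zenbase_llml's actual
# implementation (real LLML handles nested indentation, escaping,
# and formatting rules this sketch ignores):
def to_vibe_xml(data):
    parts = []
    for key, value in data.items():
        if isinstance(value, dict):
            # Nested dicts become nested tag blocks
            parts.append(f"<{key}>\n{to_vibe_xml(value)}\n</{key}>")
        elif isinstance(value, list):
            # Lists become numbered child tags: <key-1>, <key-2>, ...
            items = "\n".join(
                f"  <{key}-{i}>{item}</{key}-{i}>"
                for i, item in enumerate(value, 1)
            )
            parts.append(f"<{key}>\n{items}\n</{key}>")
        else:
            # Scalars become a single tag pair
            parts.append(f"<{key}>{value}</{key}>")
    return "\n".join(parts)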

</code></code></pre><p>LLML isn't just formatting&#8212;it's a declarative approach that makes prompt composition a breeze. Check the GitHub repo for more: <a href="https://github.com/zenbase-ai/llml">github.com/zenbase-ai/llml</a>. If you're building AI systems, this is a primitive you've been missing.</p><p>Pro tip: Format your complex tool call results as XML dynamically with LLML &#128521;</p>]]></content:encoded></item><item><title><![CDATA[Porting DSPy Optimizers — While Remembering the Bigger Picture]]></title><description><![CDATA[Or, How DSPy's Optimizers are the Trojan Horse of Engineering Discipline in LLM Engineering]]></description><link>https://newstand.zenbase.ai/p/porting-dspy-optimizers-while-remembering</link><guid isPermaLink="false">https://newstand.zenbase.ai/p/porting-dspy-optimizers-while-remembering</guid><dc:creator><![CDATA[Amir Mehr]]></dc:creator><pubDate>Wed, 14 May 2025 03:29:58 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!lgVs!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdbc018af-84b7-4060-bb9b-720d732236cf_506x500.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a 
class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!lgVs!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdbc018af-84b7-4060-bb9b-720d732236cf_506x500.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!lgVs!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdbc018af-84b7-4060-bb9b-720d732236cf_506x500.png 424w, https://substackcdn.com/image/fetch/$s_!lgVs!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdbc018af-84b7-4060-bb9b-720d732236cf_506x500.png 848w, https://substackcdn.com/image/fetch/$s_!lgVs!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdbc018af-84b7-4060-bb9b-720d732236cf_506x500.png 1272w, https://substackcdn.com/image/fetch/$s_!lgVs!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdbc018af-84b7-4060-bb9b-720d732236cf_506x500.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!lgVs!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdbc018af-84b7-4060-bb9b-720d732236cf_506x500.png" width="456" height="450.59288537549406" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/dbc018af-84b7-4060-bb9b-720d732236cf_506x500.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:500,&quot;width&quot;:506,&quot;resizeWidth&quot;:456,&quot;bytes&quot;:384930,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://newstand.zenbase.ai/i/163510834?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdbc018af-84b7-4060-bb9b-720d732236cf_506x500.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!lgVs!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdbc018af-84b7-4060-bb9b-720d732236cf_506x500.png 424w, https://substackcdn.com/image/fetch/$s_!lgVs!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdbc018af-84b7-4060-bb9b-720d732236cf_506x500.png 848w, https://substackcdn.com/image/fetch/$s_!lgVs!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdbc018af-84b7-4060-bb9b-720d732236cf_506x500.png 1272w, https://substackcdn.com/image/fetch/$s_!lgVs!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdbc018af-84b7-4060-bb9b-720d732236cf_506x500.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" 
viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>A few days ago, I mentioned on <a href="https://x.com/ammirsm/status/1920227539764850867">X</a> that if 300 people wanted to see DSPy optimizers integrated with other frameworks (like LangChain or Crew AI), I'd build it publicly. While I didn't get the 300 likes I was looking for, we received attention from several deeply interested developers who saw the potential value immediately. This kind of focused interest from the right people matters more than raw numbers, and it reinforces my belief in what I'm writing here. 
People are excited about DSPy's advanced optimization flows&#8212;and yes, <strong>optimizers</strong> are a big draw.</p><p>However, I want to emphasize something: <strong>DSPy is not just about optimizers.</strong></p><p>There's a whole philosophy here about how to build "compound AI systems" in a way that's future-proof, evaluation-driven, and not locked in to one model. If you only think of DSPy as a library for fancy prompt/RL optimization, you're missing a large part of the story.</p><h2><strong>The Philosophy Behind DSPy</strong></h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!30Wl!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb228f157-b7dd-4d63-bb21-55cf8f7ace8f_500x701.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!30Wl!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb228f157-b7dd-4d63-bb21-55cf8f7ace8f_500x701.jpeg 424w, https://substackcdn.com/image/fetch/$s_!30Wl!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb228f157-b7dd-4d63-bb21-55cf8f7ace8f_500x701.jpeg 848w, https://substackcdn.com/image/fetch/$s_!30Wl!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb228f157-b7dd-4d63-bb21-55cf8f7ace8f_500x701.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!30Wl!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb228f157-b7dd-4d63-bb21-55cf8f7ace8f_500x701.jpeg 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!30Wl!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb228f157-b7dd-4d63-bb21-55cf8f7ace8f_500x701.jpeg" width="386" height="541.172" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b228f157-b7dd-4d63-bb21-55cf8f7ace8f_500x701.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:701,&quot;width&quot;:500,&quot;resizeWidth&quot;:386,&quot;bytes&quot;:95110,&quot;alt&quot;:&quot;Four-panel galaxy-brain meme showing the mental leap from plain text prompts to DSPy&#8217;s structured signatures and modules, culminating in a glowing cosmic brain.&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://newstand.zenbase.ai/i/163510834?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb228f157-b7dd-4d63-bb21-55cf8f7ace8f_500x701.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Four-panel galaxy-brain meme showing the mental leap from plain text prompts to DSPy&#8217;s structured signatures and modules, culminating in a glowing cosmic brain." title="Four-panel galaxy-brain meme showing the mental leap from plain text prompts to DSPy&#8217;s structured signatures and modules, culminating in a glowing cosmic brain." 
srcset="https://substackcdn.com/image/fetch/$s_!30Wl!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb228f157-b7dd-4d63-bb21-55cf8f7ace8f_500x701.jpeg 424w, https://substackcdn.com/image/fetch/$s_!30Wl!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb228f157-b7dd-4d63-bb21-55cf8f7ace8f_500x701.jpeg 848w, https://substackcdn.com/image/fetch/$s_!30Wl!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb228f157-b7dd-4d63-bb21-55cf8f7ace8f_500x701.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!30Wl!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb228f157-b7dd-4d63-bb21-55cf8f7ace8f_500x701.jpeg 1456w" sizes="100vw"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" 
stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Leveling up from &#8216;prompt hacking&#8217; to a compound AI system mindset.</figcaption></figure></div><p><a href="https://twitter.com/lateinteraction">Omar</a> famously laid out <a href="https://x.com/lateinteraction/status/1921565300690149759">5 &#8220;core bets&#8221;</a> that guide DSPy's design:</p><p><strong>1. Information Flow is the main bottleneck.</strong></p><p>As foundation models improve, the key becomes (a) asking the right question, and (b) providing all necessary context. DSPy addresses this with flexible control flow (&#8220;compound AI systems&#8221;) and <strong>Signatures</strong> for structured I/O.</p><blockquote><p><em><strong>Why I agree:</strong></em> Too often, we get distracted by &#8220;prompts&#8221; themselves. But what truly matters is whether each part of the LLM pipeline has the right data or context at the right time. DSPy's approach to laying out how data flows between modules is crucial for robust systems.</p></blockquote><p><strong>2. Interactions with LLMs should be Functional and Structured.</strong></p><p> Rather than treat everything as chat messages or big text prompts, we should define them as <em>functions</em> with well-defined inputs, outputs, and instructions.</p><blockquote><p><em><strong>Why I agree:</strong></em> If you think &#8220;prompt = a random string&#8221;, you end up with a constant string-tweak headache. But if you see &#8220;prompt = function signature with inputs/outputs&#8221;, it's a lot cleaner&#8212;and more amenable to optimization.</p></blockquote><p><strong>3. 
Inference Strategies should be Polymorphic Modules.</strong></p><p>Techniques like CoT, ReAct, ToT, or other inference patterns should be easily swappable, just like layers in PyTorch. Each Module is generic and can apply to any DSPy Signature.</p><blockquote><p><em><strong>Why I agree:</strong></em> We want to re-use these patterns across tasks, not re-implement them from scratch. It's the difference between a well-architected library and a tangle of scripts.</p></blockquote><p><strong>4. Separate the specification of AI software behavior from learning paradigms.</strong></p><p>Whether you do fine-tuning, RL, or iterative prompt optimization, your base definitions (Signatures and Modules) shouldn't need rewriting.</p><blockquote><p><em><strong>Why I agree:</strong></em> Historically, every time we switched from LSTMs to Transformers or from BERT to GPT-3, everything broke. DSPy wants to solve that by letting us keep the same "spec" of what the system is supposed to do, while the "how" (or the training approach) can evolve over time.</p></blockquote><p><strong>5. Natural Language Optimization is powerful.</strong></p><p>Combining fine-tuning with high-level language instructions can be extremely sample-efficient and more intuitive than purely numeric RL.</p><blockquote><p><em><strong>Why I agree:</strong></em> We've seen in practice how "prompt-level" interventions, guided by clear instructions and examples, can drastically reduce the trial-and-error cycle. This has been demonstrated by optimizers like MIPROv2 and SIMBA, and by other research that merges text-based feedback with gradient-based methods.</p></blockquote><p>In a sense, <strong>DSPy</strong> has a "compiler" for declarative compound AI systems. 
You specify Signatures and Modules, and DSPy translates that into the best possible LLM behavior.</p><h2><strong>Why Decouple the Optimizers?</strong></h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Kw-x!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fca3e93b5-8e1c-4e5d-bb54-4271156f9105_500x500.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Kw-x!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fca3e93b5-8e1c-4e5d-bb54-4271156f9105_500x500.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Kw-x!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fca3e93b5-8e1c-4e5d-bb54-4271156f9105_500x500.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Kw-x!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fca3e93b5-8e1c-4e5d-bb54-4271156f9105_500x500.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Kw-x!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fca3e93b5-8e1c-4e5d-bb54-4271156f9105_500x500.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Kw-x!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fca3e93b5-8e1c-4e5d-bb54-4271156f9105_500x500.jpeg" width="500" height="500" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ca3e93b5-8e1c-4e5d-bb54-4271156f9105_500x500.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:500,&quot;width&quot;:500,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:53988,&quot;alt&quot;:&quot;Drake meme contrasting the pain of a full framework rewrite with the joy of dropping DSPy optimizers straight into existing stacks.&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://newstand.zenbase.ai/i/163510834?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fca3e93b5-8e1c-4e5d-bb54-4271156f9105_500x500.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Drake meme contrasting the pain of a full framework rewrite with the joy of dropping DSPy optimizers straight into existing stacks." title="Drake meme contrasting the pain of a full framework rewrite with the joy of dropping DSPy optimizers straight into existing stacks." 
srcset="https://substackcdn.com/image/fetch/$s_!Kw-x!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fca3e93b5-8e1c-4e5d-bb54-4271156f9105_500x500.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Kw-x!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fca3e93b5-8e1c-4e5d-bb54-4271156f9105_500x500.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Kw-x!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fca3e93b5-8e1c-4e5d-bb54-4271156f9105_500x500.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Kw-x!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fca3e93b5-8e1c-4e5d-bb54-4271156f9105_500x500.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" 
stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Distribution first, full adoption second.</figcaption></figure></div><p>There are two perspectives on DSPy adoption that I wrestle with:</p><p><strong>From a research perspective</strong>, the ideal approach is to build everything in DSPy from scratch. You define your entire compound AI system in a structured, type-safe way, and let the DSPy compiler and optimizers do their job. This provides the cleanest implementation, maximum synergy between components, and the full power of DSPy's abstractions.</p><p><strong>From a practical perspective</strong>, we must acknowledge real-world constraints. Many developers have entire pipelines in frameworks like LangChain, Crew AI, or custom solutions. They might not have the time or resources to rewrite everything around DSPy's abstractions&#8212;yet they'd still love to tap into our <strong>optimizers</strong>. 
So it makes sense to <strong>decouple</strong> those optimizers for broader use:</p><ul><li><p>It can help more people get immediate improvements without a total workflow rewrite.</p></li><li><p>It addresses real-world distribution challenges: we want DSPy's ideas to spread widely.</p></li><li><p>As more devs apply these optimizers, we can gather <strong>more datasets and benchmark results</strong>, which improves optimization research and DSPy's evolution itself.</p></li><li><p>While decoupled optimizers might not achieve their highest possible accuracy without DSPy's discrete abstractions, they can still provide significant value when integrated with other frameworks.</p></li></ul><p>This &#8220;distribution first, full adoption second&#8221; approach maximizes usage, fosters community contributions, and hopefully nudges teams toward the full DSPy abstractions over time.</p><div class="pullquote"><p>A <strong>broader user base</strong> means <strong>more real-world data</strong>, <strong>more unique tasks</strong>, and <strong>more varied benchmarks</strong>. 
All of that feeds back into optimization research efforts, letting us <strong>refine</strong> algorithms and evaluate them across a wider spectrum of use cases.</p></div><h2><strong>The Power of Compound AI Abstraction</strong></h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!lgVs!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdbc018af-84b7-4060-bb9b-720d732236cf_506x500.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!lgVs!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdbc018af-84b7-4060-bb9b-720d732236cf_506x500.png 424w, https://substackcdn.com/image/fetch/$s_!lgVs!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdbc018af-84b7-4060-bb9b-720d732236cf_506x500.png 848w, https://substackcdn.com/image/fetch/$s_!lgVs!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdbc018af-84b7-4060-bb9b-720d732236cf_506x500.png 1272w, https://substackcdn.com/image/fetch/$s_!lgVs!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdbc018af-84b7-4060-bb9b-720d732236cf_506x500.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!lgVs!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdbc018af-84b7-4060-bb9b-720d732236cf_506x500.png" width="456" height="450.59288537549406" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/dbc018af-84b7-4060-bb9b-720d732236cf_506x500.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:500,&quot;width&quot;:506,&quot;resizeWidth&quot;:456,&quot;bytes&quot;:384930,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://newstand.zenbase.ai/i/163510834?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdbc018af-84b7-4060-bb9b-720d732236cf_506x500.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!lgVs!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdbc018af-84b7-4060-bb9b-720d732236cf_506x500.png 424w, https://substackcdn.com/image/fetch/$s_!lgVs!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdbc018af-84b7-4060-bb9b-720d732236cf_506x500.png 848w, https://substackcdn.com/image/fetch/$s_!lgVs!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdbc018af-84b7-4060-bb9b-720d732236cf_506x500.png 1272w, https://substackcdn.com/image/fetch/$s_!lgVs!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdbc018af-84b7-4060-bb9b-720d732236cf_506x500.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>DSPy rests on a core belief: <strong>defining your &#8220;compound AI systems&#8221; with explicit Signatures is key</strong>. By specifying modules' inputs and outputs clearly, you do more than produce a &#8220;better prompt&#8221;: you define the entire structure of data flow across your AI pipeline.</p><h3><strong>Meeting Developers Where They Are</strong></h3><p>Different projects rely on different frameworks (or no framework at all), each with unique assumptions. Some agent-based platforms don't expose the entire conversation to the dev. GUI-driven frameworks might store state in ways that don't map cleanly to DSPy Signatures. Others might rely on specialized data flows you can't trivially replicate.</p><p>That's why <em><strong>forcing</strong></em> everyone to use DSPy from the ground up can be a deal-breaker. 
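</p><p>What does &#8220;using a DSPy-style optimizer from outside DSPy&#8221; look like in practice? Here's a deliberately simplified sketch in plain Python. The names and the search loop are hypothetical stand-ins, not DSPy's actual API; the point is that the only things such an optimizer demands from your existing stack are a callable LLM, a small dataset, and a metric:</p>

```python
# Toy illustration only: names and structure are hypothetical, not DSPy's API.

def evaluate(prompt_template, dataset, run_llm, metric):
    """Average metric score for one candidate prompt over an eval set."""
    scores = [
        metric(run_llm(prompt_template.format(**example)), example["answer"])
        for example in dataset
    ]
    return sum(scores) / len(scores)

def optimize(candidates, dataset, run_llm, metric):
    """Return the best-scoring prompt variant; no framework rewrite needed."""
    return max(candidates, key=lambda p: evaluate(p, dataset, run_llm, metric))

# Stand-in LLM so the sketch runs without network calls.
def fake_llm(prompt):
    return "4" if "2 + 2" in prompt else "unsure"

dataset = [{"question": "What is 2 + 2?", "answer": "4"}]
candidates = ["Q: {question}\nA:", "Answer with just the number. {question}"]
exact_match = lambda output, gold: float(output.strip() == gold)

best = optimize(candidates, dataset, fake_llm, exact_match)
```

<p>The real optimizers search far more cleverly (bootstrapped demonstrations, instruction rewriting, and so on), but the contract is the same: candidate programs scored against data. That contract is what lets them bolt onto any pipeline.</p><p>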
But if they can use DSPy's optimizers from within their existing environment, they get a taste of the benefits. Hopefully, they'll then consider switching more parts of their code to DSPy's structured approach.</p><h3><strong>The Data-Driven Imperative</strong></h3><p>A big part of DSPy's approach is its focus on <strong>evaluations</strong>. If you want an optimizer, you need an objective function, or at least a robust metric. Many LLM devs skip systematic evals, relying on quick &#8220;does this prompt work?&#8221; checks. At scale, that ad-hoc approach becomes unsustainable.</p><p>People like <a href="http://x.com/eugeneyan">Eugene Yan</a>, <a href="https://x.com/HamelHusain">Hamel Husain</a>, <a href="https://x.com/sh_reya">Shreya Shankar</a>, and <a href="https://x.com/jxnlco">Jason Liu</a> all <strong>explicitly teach</strong> folks to look at data, analyze errors, use evals, and iterate based on measurable outcomes.</p><p>They point out how crucial it is to measure each iteration&#8217;s success or failure. Doing it manually is tedious, which is why DSPy aims to make &#8220;Signatures + Evaluations + Optimizers&#8221; a cohesive process from the start.</p><p>There's an additional principle that beautifully complements Omar's bets: <strong>the reinforcement of software and ML engineering fundamentals</strong>. When LLMs first emerged, many developers (understandably) set aside these fundamentals in the excitement of "prompt engineering." We often bypassed clear problem definitions, success metrics, and test datasets. Instead, we relied on subjective judgments: "this prompt feels better than that one."</p><p>Now we're coming full circle. Just as traditional ML requires well-defined problems and loss functions, effective LLM development demands:</p><ol><li><p>Clear problem definitions (what exactly are we trying to solve?)</p></li><li><p>Measurable success criteria (how do we know if we're improving?)</p></li><li><p>Test datasets (what examples validate our approach?)</p></li></ol><p>What's powerful about DSPy is how it <strong>implicitly educates</strong> developers by making these principles part of its workflow. By requiring Signatures and encouraging evaluation functions, DSPy naturally guides developers toward better practices.</p><p>But here's the key insight: <strong>porting optimizers is about distributing these foundational principles to everyone</strong>. Instead of just telling people &#8220;structured AI development is good for you&#8221;, we're saying &#8220;take this optimizer and you'll see immediate improvements&#8212;but you'll need to create datasets and evaluation metrics first.&#8221; It's a Trojan horse for better engineering practices.</p><h2><strong>Common Hurdles (and my Perspective)</strong></h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!_p_j!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a88d8f3-73bc-44c8-9a2a-e840fc7ba229_503x496.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!_p_j!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a88d8f3-73bc-44c8-9a2a-e840fc7ba229_503x496.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!_p_j!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a88d8f3-73bc-44c8-9a2a-e840fc7ba229_503x496.jpeg 848w, https://substackcdn.com/image/fetch/$s_!_p_j!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a88d8f3-73bc-44c8-9a2a-e840fc7ba229_503x496.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!_p_j!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a88d8f3-73bc-44c8-9a2a-e840fc7ba229_503x496.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!_p_j!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a88d8f3-73bc-44c8-9a2a-e840fc7ba229_503x496.jpeg" width="503" height="496" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/1a88d8f3-73bc-44c8-9a2a-e840fc7ba229_503x496.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:496,&quot;width&quot;:503,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:84692,&quot;alt&quot;:&quot;Cartoon of Sisyphus struggling to roll a boulder labeled &#8216;Legacy pipeline&#8217; up a hill titled &#8216;Framework migration.&#8217;&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://newstand.zenbase.ai/i/163510834?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a88d8f3-73bc-44c8-9a2a-e840fc7ba229_503x496.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Cartoon of Sisyphus struggling to roll a boulder labeled &#8216;Legacy 
pipeline&#8217; up a hill titled &#8216;Framework migration.&#8217;" title="Cartoon of Sisyphus struggling to roll a boulder labeled &#8216;Legacy pipeline&#8217; up a hill titled &#8216;Framework migration.&#8217;" srcset="https://substackcdn.com/image/fetch/$s_!_p_j!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a88d8f3-73bc-44c8-9a2a-e840fc7ba229_503x496.jpeg 424w, https://substackcdn.com/image/fetch/$s_!_p_j!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a88d8f3-73bc-44c8-9a2a-e840fc7ba229_503x496.jpeg 848w, https://substackcdn.com/image/fetch/$s_!_p_j!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a88d8f3-73bc-44c8-9a2a-e840fc7ba229_503x496.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!_p_j!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a88d8f3-73bc-44c8-9a2a-e840fc7ba229_503x496.jpeg 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption">Feels like pushing a boulder? Start by bolting on optimizers instead of rebuilding the hill.</figcaption></figure></div><ul><li><p><strong>Porting Production Systems</strong>: If your AI pipeline is in production, switching frameworks is tough. That's why bridging DSPy optimizers is valuable&#8212;you can start using them immediately, with no major rewrite.</p></li><li><p><strong>Future-Proofing</strong>: People worry about picking the &#8220;wrong&#8221; abstraction. But DSPy is designed to evolve as LLMs evolve, so you're not forced to keep rewriting all your prompts when new models appear.</p></li><li><p><strong>Overhead of Structured Signatures</strong>: Defining inputs/outputs might feel like extra work. But it pays off when you run repeated optimization cycles&#8212;everything is cleaner, more robust, and easier to track.</p></li><li><p><strong>Abstraction Diversity</strong>: Even as we push to define the best abstraction within DSPy, some use cases will lead developers toward different abstractions. The key insight is that regardless of the abstraction chosen, Natural Language Optimization remains a powerful learning paradigm that can benefit all approaches.</p></li></ul><h2><strong>The Vision: From Optimizers to Ecosystem</strong></h2><p>Ultimately, the dream is that DSPy's structured way of thinking becomes the standard for building compound AI systems. 
Opening up the optimizers is a <strong>practical</strong> step toward that vision:</p><ul><li><p><strong>Immediate Gains</strong>: Devs see better performance, even if they only adopt a fraction of DSPy.</p></li><li><p><strong>Broader Usage &#8594; More Data</strong>: As more people use DSPy optimizers, they generate valuable datasets and benchmarks. These datasets open up avenues for people to contribute and share their insights, directly feeding into future optimization research.</p></li><li><p><strong>Full DSPy Adoption</strong>: Over time, teams realize that adding Signatures and Modules (i.e., truly structuring their AI pipelines) unlocks even greater synergy and maintainability.</p></li></ul><h2><strong>Building in Public &amp; Join the Beta</strong></h2><p>I'm committed to developing these integrations in the open&#8212;sharing code, design decisions, and prototypes publicly. This isn't just about transparency; it's about creating better tools through community feedback and real-world testing.</p><p>Our priorities for this effort are:</p><ol><li><p><strong>Developing in the open</strong>: All code, design decisions, and integration approaches will be publicly available.</p></li><li><p><strong>Listening to feedback</strong>: Your real-world use cases and challenges will directly shape these integrations.</p></li><li><p><strong>Emphasizing data/evals</strong>: We'll help you define metrics and gather evaluation data as you implement these optimizers.</p></li></ol><div class="pullquote"><p>I'd love to talk with you directly about using these integrations. If you're interested in being part of this, we've opened a <strong><a href="https://zenbase.typeform.com/to/pYLaG9BJ">Slack</a></strong> workspace to chat. 
Join us!</p></div><h2><strong>In Summary</strong></h2><ul><li><p><strong>DSPy goes beyond optimizers</strong>&#8212;it's a holistic, evaluation-driven compiler for compound AI systems, built on structured Signatures, evaluation-centric loops, and modular inference; the optimizers are just one expression of that mindset.</p></li><li><p><strong>Quick wins, long-term discipline</strong>&#8212;decoupled optimizers provide instant gains while acting as a <em><strong>Trojan horse</strong></em> for good engineering: you must gather datasets, define metrics, and measure progress.</p></li><li><p><strong>More users &#8594; more data &#8594; better research</strong>&#8212;every new integration expands the pool of real-world tasks, fueling benchmark creation and faster innovation in optimization algorithms.</p></li><li><p><strong>Abstraction-agnostic power</strong>&#8212;whatever framework you favor (DSPy, LangChain, CrewAI, or something custom), Natural Language Optimization remains a portable paradigm that DSPy makes easy to adopt.</p></li></ul><p>Start with the optimizers, stay for the structure. Plug them into your stack today, collect data &amp; evaluations, and you'll already be on the path toward the full DSPy approach&#8212;where Signatures, Modules, and evaluation-first design unlock deeper, longer-term returns.</p><div class="pullquote"><p>I'd love to talk with you directly about using these integrations. If you're interested in being part of this, we've opened a <strong><a href="https://zenbase.typeform.com/to/pYLaG9BJ">Slack</a></strong> workspace to chat. Join us!</p></div>]]></content:encoded></item></channel></rss>