NLP
March 6, 2026
You are reading this in handwritten mode. I handwrite to help me think through ideas. It also provides a nice attestation that this blog is human-written. The text below is a Gemini OCR of my handwriting; there may be character mistakes or hallucinations.
NLP is Dead, Long Live NLP
Natural language processing had its heyday in 2023; since then, it has been a solved field. We can spin long agent-driven webs to handle complicated language tasks that would historically have required tens of different spaCy modules, heuristics, or other hacks. The key to language processing, it seems, is nothing more than scale. My goal in this piece is not to discuss computational linguistics or argue over its last stand; indeed, any such conversation would be happening over the corpse of the NLP field. CL will continue to exist with the same purpose it always had: to study linguistics with the aid of the best language models available (neural or otherwise). NLP was always distinct from CL in that it created practical tools that CL could leverage. Thus, when I state that NLP is dead, take it to mean that all the hard-coded heuristics, lossy decision trees, and low-accuracy semantic matching algorithms have been placed under a moratorium, surviving only for as long as the cost and speed of SLMs prevent large-scale deployment.
So NLP is dead; what next? For decades, engineers have worked as NLP engines, converting noisy product requirements into polished feature development. Today the role of engineer is replaced in part by language models trained and tasked for system development. These models and their associated workflows are sophisticated and capable, leading to veritable productivity amplification in small teams <cite: my vibes>. Working with these tools is a new paradigm of computing, a paradigm I call Natural Language Programming. This is the paradigm of dynamic workflow development, personalized user experiences, and customized world-content creation. It is a drastically new paradigm compared with its predecessors of functional and object-oriented programming. See the attached taxonomy for further clarification.
The reason for OPS
- like OSS: let people see prompts, do analysis, democratize learning
What is OPS?
- today: git-checked-in prompts
- tomorrow: prompt analysis & security & optimization & education
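To make the "today" note concrete, here is a minimal sketch (my own illustration, not an existing OPS tool) of prompts as version-controlled text files that tooling can read and crudely analyze like source code:

```python
from pathlib import Path

# Sketch: a prompt checked into git as a plain text file.
def load_prompt(path: Path) -> str:
    return path.read_text()

def crude_analysis(prompt: str) -> dict:
    # Trivial stand-in for the richer analysis OPS could enable
    # (security checks, optimization, education).
    return {"chars": len(prompt), "lines": prompt.count("\n") + 1}
```

Anything beyond counting characters and lines is the "tomorrow" column of the note above.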
So this is the infancy of NLP, the new paradigm sweeping the developer world. Our data is unstructured, our control is experience-driven and reactive, and our analysis is crude at best.
The Paradigm of NLP: A Partial Taxonomy
Each paradigm of computation has a core building block which defines the relationship between data, control flow, and access.
Functional Programming naturally relies on functions. With this singular tool, FP languages primarily use recursion as a device for managing control flow. Recursion strongly entails purity and immutability of data. Finally, for the purpose of access and developer analysis, type systems emerge from FP.
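A minimal Python sketch of those three FP ingredients together, using recursion for control flow, an immutable tuple for data, and type hints standing in for a type system:

```python
from typing import Tuple

# Pure, recursive sum: control flow is recursion, data is an
# immutable tuple, and the annotations act as the type system.
def total(xs: Tuple[int, ...]) -> int:
    if not xs:
        return 0
    head, tail = xs[0], xs[1:]
    return head + total(tail)

# total((1, 2, 3)) evaluates to 6 with no mutation anywhere.
```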
Objects are the core component of Object-Oriented Programming. The purpose of an object is to receive and respond to messages with methods. These runtime method invocations are the primary control-flow mechanism of OOP. Data is managed and encapsulated within the abstraction of the object. As for access and analysis, inheritance and subtyping provide complicated polymorphic behavior.
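The same ingredients for OOP, sketched in Python: the message `area()` is resolved at run time based on the receiver, and subtyping gives the polymorphism:

```python
class Shape:
    def area(self) -> float:
        raise NotImplementedError

class Square(Shape):
    def __init__(self, side: float) -> None:
        self.side = side  # data encapsulated in the object
    def area(self) -> float:
        return self.side * self.side

class Circle(Shape):
    def __init__(self, radius: float) -> None:
        self.radius = radius
    def area(self) -> float:
        return 3.14159 * self.radius ** 2

# The "message" area() dispatches at runtime on the receiver's class.
def describe(shape: Shape) -> float:
    return shape.area()
```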
Natural Language Programming is built on the generative response. Control flow is managed via extremely late binding, similar to method invocation in OOP. In OOP the dispatch is dynamic, based on the value of the receiver (the object) and the types of the arguments (in multiple-dispatch languages). NLP expands on this power by modifying control based on the values of the data as well. This capability is built into the paradigm, not a tool bolted on after the fact: the language itself responds to the data presented to it at instruction time and run time. A good example of early exploration into this capability is Syntax and kin. Data is unstructured within NLP, by design; we are free of the strictly typed fetters. Access and analysis are the last dimension of the paradigm, and they are hitherto thoroughly underexplored. Today we have tools like gemini-cli, coder, and claude code which define our access points. I argue that for a proper paradigm we need proper analysis. This is why I am supporting Open Prompt Software (OPS) as a new target to complement open source. With open prompting we can begin to develop prompt analysis techniques and tooling <cite: PromptSet/>, an area which is woefully misunderstood.
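A hypothetical sketch of that "dispatch on data values" idea. The `generate` function here is a stub standing in for a real language-model call; everything about it (names, the routing labels) is my illustration, not any particular API:

```python
# Stub standing in for a real LLM call. A deployed system would send
# the prompt to a model; here we fake a routing decision on the text.
def generate(prompt: str) -> str:
    return "refund" if "money back" in prompt else "support"

def route(user_message: str, handlers: dict) -> str:
    # Extremely late binding: the branch taken depends on the *values*
    # of the data, as interpreted by a generative response at run time.
    decision = generate(f"Classify this request: {user_message}")
    return handlers[decision](user_message)

handlers = {
    "refund": lambda m: "routed to refunds",
    "support": lambda m: "routed to support",
}
```

No type or receiver determines the branch; the content of the message does.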
A taxonomy of Prompts:
User Prompts: these are conversational messages sent to language models for the purpose of receiving responses as a user. This is the most commonly studied prompt category today, and what you see in DevGPT, MMMLU, ShareGPT, etc.
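For concreteness, a user prompt in the chat-message shape most model APIs accept (field names follow the common OpenAI-style convention; this is the form the datasets above collect):

```python
# A single-turn user prompt in the common chat-message format.
user_prompt = [
    {"role": "user", "content": "Explain recursion in one sentence."}
]
```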
Developer Prompts: these define control-flow structures within programs. DSPy calls them part of the Language Program (LP). Examples include parsing user input data and forking control based on the request. Simple concrete examples from my work at ALL3D include selecting the best candidate product angle or applying specific heuristics for rug-category images. Dev prompts represent a core unit of an LP but are not necessary for NLP. NLP, like FP and OOP, is a way of creating systems; an LP is one type of system.
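A hedged sketch of a developer prompt as a unit of a Language Program, loosely modeled on the "best candidate angle" example. The prompt text, function names, and the stubbed model call are all hypothetical illustrations, not ALL3D's actual code:

```python
# Hypothetical developer prompt, checked in alongside the program.
SELECT_ANGLE_PROMPT = """You are given {n} candidate product photos.
Return only the index of the angle that best shows the product."""

def call_model(prompt: str) -> str:
    # Stub standing in for a real LLM call.
    return "0"

def select_best_angle(candidates: list) -> str:
    # The prompt is a control-flow unit: the model's answer decides
    # which candidate the rest of the program proceeds with.
    prompt = SELECT_ANGLE_PROMPT.format(n=len(candidates))
    index = int(call_model(prompt))
    return candidates[index]
```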
Program Prompts: these are prompts which define a program. They are the specification and requirements. They present as conversations, unit tests, brownfield repositories, and more. Here the goal is to create or modify a system. We collectively have very little data on which program prompts work, beyond anecdotes from coworkers or forums. The only ones who might be able to learn at scale are the companies selling the service, and they are incentivized to keep their performance methods secret.