9/7/25
The story of “The Genie and the Three Wishes” is a familiar tale that has circulated through civilizations for centuries. A person stumbles upon a genie and is granted three wishes; however, each wish must be carefully worded, because the genie interprets it literally, and a poorly phrased request leads to undesirable outcomes. Although it is safe to say that, in reality, one is unlikely to be approached by a wish-granting genie, versions of this situation appear in everyday life. For example, when dealing with ever-evolving AI chatbots, one tries to craft the ideal prompt to obtain an optimal response in as few queries as possible.
In fact, the process of drafting prompts for generative AI has become a vital part of the initiative to integrate AI into the workforce. A specific role has been created to maximize the potential for AI adoption: the prompt engineer. A prompt engineer is tasked with designing precise inputs for generative AI models by integrating structure and constraints into the prompt.
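To make the idea concrete, here is a minimal sketch of that kind of structured input. The helper and its fields are hypothetical, not tied to any particular model or vendor; the point is simply that a prompt engineer adds an explicit role, audience, and output constraints around a raw request rather than sending the request alone.

```python
# Hypothetical illustration of a structured prompt: role, task, audience,
# and output constraints are stated explicitly instead of being implied.
def build_prompt(task: str, audience: str, max_words: int) -> str:
    """Wrap a raw task in explicit structure and constraints."""
    return (
        "You are a technical writer.\n"      # role the model should adopt
        f"Task: {task}\n"                    # the actual request
        f"Audience: {audience}\n"            # who the answer is for
        f"Constraints: respond in at most {max_words} words, "
        "as a bulleted list, with no marketing language.\n"  # output format
    )

prompt = build_prompt("Explain what an API rate limit is", "new developers", 120)
print(prompt)
```

A plain request like “explain rate limits” leaves the model to guess the depth, tone, and format; the structured version pins all three down, which is the core of the prompt engineer’s craft.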
Some consider prompt engineering to be a gateway occupation for people with non-technical backgrounds who wish to jump onto the AI bandwagon. At a glance, it seems like a golden opportunity, as an entry-level prompt engineer in the United States can earn a six-figure salary without a formal degree. However, the longevity of this role is being questioned by experts, and most agree that the job has an expiration date, if it is not already dead.
Numerous reasons fuel this debate, but one of the most significant is that no matter how well crafted a prompt is, a model that has not been trained on the relevant knowledge or data cannot return an appropriate response.
Context engineering emerged to address the limitations of basic Large Language Models (LLMs) and of prompt engineering alone. It bridges the knowledge gap and transforms a general LLM into a more resourceful, situation-aware system. Context engineering uses techniques such as Retrieval-Augmented Generation (RAG), which gives an LLM access to external documents, along with conversation-history management and refined summarization methods that ensure only the most relevant information reaches the model’s context window.
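The RAG idea can be sketched in a few lines. This is a deliberately simplified illustration, not a production retriever: real systems use embedding models and vector search, whereas here retrieval is just word overlap, and every name (`retrieve`, `build_rag_prompt`, the sample documents) is hypothetical.

```python
import re

# Toy RAG sketch: pick the document that shares the most words with the
# query, then prepend it to the prompt so the model answers from retrieved
# context instead of relying only on its training data.
def _tokens(text: str) -> set[str]:
    """Lowercase word set, punctuation stripped."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def retrieve(query: str, documents: list[str]) -> str:
    """Return the document with the greatest word overlap with the query."""
    q = _tokens(query)
    return max(documents, key=lambda d: len(q & _tokens(d)))

def build_rag_prompt(query: str, documents: list[str]) -> str:
    """Combine the retrieved passage and the question into one prompt."""
    context = retrieve(query, documents)
    return f"Context: {context}\nQuestion: {query}\nAnswer using only the context."

docs = [
    "The refund window for online orders is 30 days from delivery.",
    "Support is available by chat from 9am to 5pm on weekdays.",
]
print(build_rag_prompt("How many days do I have to request a refund?", docs))
```

Even in this toy form, the shape is the same as in real systems: retrieval selects relevant external text, and the prompt instructs the model to ground its answer in that text, which is how context engineering closes the knowledge gap a static model cannot.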
Context and prompt engineering have worked alongside one another for the last few years, creating the perfect pairing for LLM advancements. Today, experts are emphasizing the demand for context engineering over prompt engineering.
Early generations of LLMs struggled to parse poorly written requests. To address this, prompt engineers were brought in to write clean, reliable prompts that coaxed optimal responses from these models. Modern models do not require that level of hand-holding, and context engineering has narrowed the gap further: people with minimal prompting experience can now interact with models and obtain the output they want.
A comparison can be drawn between today’s prompt engineer and the internet researcher of roughly thirty years ago: with every new technology, temporary roles emerge to bridge the gap until the systems evolve and the average user can manage on their own. The internet researcher was responsible for mastering early search engines, using Boolean operators and advanced query methods to find reliable data for professionals across a range of disciplines. Now approximately 92% of jobs require digital literacy, and searching the web is a routine activity for 68.7% of the world’s population. As models evolve and become more accessible worldwide, it is likely that, much like the internet before them, AI literacy will be added to job requirements and more people will become proficient with LLMs, making prompt engineering in its current form obsolete.
While prompt engineering remains an integral part of maximizing the potential of AI models, unless the role evolves alongside the models themselves, there will not be much room for growth, and it may diminish as the technology becomes more intuitive and accessible to the average user.
