Together with Growth School & Infinite Uptime
The AI race is getting faster and dirtier by the day. Things we could never have imagined are happening.
Meta just poached four of OpenAI’s top researchers…
So if you’re not learning AI today, you probably won’t have a job in the next six months.
That’s why you need to join the 3-Day Free AI Mastermind by Outskill: 16 hours of intensive training on AI frameworks, hands-on building sessions, creating images and videos, and more, designed to make you an AI expert. Originally priced at $895, the first 100 of you get in completely FREE! Extended 4th of July SALE! 🎁
In the 5 sessions, you will:
All by global experts from companies like Amazon, Microsoft, SamurAI and more. And it’s ALL. FOR. FREE. 🤯 🚀
$5100+ worth of AI tools across 3 days — Day 1: 3000+ Prompt Bible, Day 2: Roadmap to make $10K/month with AI, Day 3: Your Personal AI Toolkit Builder.
Sponsored
Welcome to BIPro #104 ~ rethinking enterprise intelligence for the GenAI era.
This issue marks a pivotal moment in how we think about intelligence, not just artificial, but organizational. In our featured deep dive, Rahul Singh, Data Science Manager at Adobe, explores the fast-emerging discipline of Context Engineering in his groundbreaking piece:
“Beyond Prompts: The Rise of Context Engineering”.
As enterprises transition from assistive chatbots to autonomous AI agents, Rahul makes a compelling case: context, not prompt design, is the true foundation of effective, scalable AI. Drawing on real-world insights and examples from McKinsey, LangChain, and organizations’ own data infrastructure, this piece offers a practical roadmap for building AI systems that understand, reason, and act with enterprise-grade intelligence.
Whether you're an analytics leader, a data professional, or building GenAI infrastructure, this is one article you don’t want to miss.
Also in This Issue:
BIPro 104 is your guide to the future of data platforms, context-aware AI, and modern orchestration. As GenAI systems evolve beyond simple chat prompts, the need for deeply integrated, domain-specific context is the next critical unlock. Let's build it, intelligently, and together.
✉️ Have tips or tools to share? Reply and contribute to our next edition.
Cheers,
Merlyn Shelley
Growth Lead, Packt
PlantOS Manufacturing Intelligence is powering the next era of industrial performance — delivering 99.97% equipment availability and up to 2% energy savings per unit produced. From steel to cement, manufacturers worldwide are turning fragmented data into confident decisions across every layer of production — from parameter to plant to global scale.
Sponsored
Why context engineering is the next frontier in building smarter, more reliable AI systems.
Written by Rahul Singh, Data Science Manager @Adobe.
Over my seven-plus-year career in data science, working on projects ranging from customer-value measurement to product analytics and personalization, one question has remained constant through it all: Do we have the right data, and can we trust it?
With the rapid rise of Generative AI, that question hasn’t disappeared; it’s become even more urgent. As AI systems evolve from proof-of-concept assistive chatbots to autonomous agents capable of reasoning and acting, their success increasingly depends not on how complex or powerful they are, but on how well they understand the context in which they operate.
In recent weeks, leaders like Tobi Lütke (CEO of Shopify), Andrej Karpathy (former Director of AI at Tesla), and others have spotlighted this shift. Lütke’s tweet was widely reshared, including by Karpathy, who elaborated on it further. He emphasized that context engineering is not about simple prompting, but about carefully curating, compressing, and sequencing the right mix of task instructions, examples, data, tools, and system states to guide intelligent behavior. This emerging discipline, still poorly understood in most organizations, is quickly becoming foundational to any serious application of generative AI.
This growing attention to context engineering signals a broader shift underway in the AI landscape. For much of the past year, prompt engineering dominated the conversation, shaping new job titles and driving a surge in hiring interest. But that momentum is tapering. A Microsoft survey across 31 countries recently ranked “Prompt Engineer” near the bottom of roles companies plan to hire (Source). Job search trends reflect the change as well: according to Indeed, prompt-related job searches have dropped from 144 per million to just 20–30 (Source).
But this decline doesn’t signal the death of prompt engineering by any means. Instead, it reflects a field in transition. As use cases evolve from assistive to agentic AI, ones that can plan, reason, and act autonomously, the core challenge is no longer just about phrasing a good prompt. It’s about whether the model has the right information, at the right time, to reason and take meaningful action.
This is where Context Engineering comes in!
If prompt engineering is about writing the recipe, carefully phrased, logically structured, and goal-directed, then context engineering is about stocking the pantry, prepping the key ingredients, and ensuring the model remembers what’s already been cooked. It’s the discipline of designing systems that feed the model relevant data, documentation, code, policies, and prior knowledge, not just once, but continuously and reliably.
In enterprises, where critical knowledge is often proprietary and fragmented across various platforms, including SharePoint folders, Jira tickets, Wiki pages, Slack threads, Git Repositories, emails, and dozens of internal tools, the bottleneck for driving impact with AI is rarely the prompt. It’s the missing ingredients from the pantry, the right data, delivered at the right moment, in the right format. Even the most carefully crafted prompt will fall flat if the model lacks access to the organizational context that makes the request meaningful, relevant, and actionable.
And as today’s LLMs evolve into Large Reasoning Models (LRMs), and agentic systems begin performing real, business-critical tasks, context becomes the core differentiator. Models like OpenAI’s o3 and Anthropic’s Claude Opus 4 can handle hundreds of thousands of tokens in one go. But sheer capacity is not enough to guarantee success. What matters is selectively injecting the right slices of enterprise knowledge: source code, data schemas, metrics, KPIs, compliance rules, naming conventions, internal policies, and more.
This orchestration of context is not just document retrieval; it’s evolving into a new systems layer. Instead of simply fetching files, these systems now organize and deliver the right information at the right step, sequencing knowledge, tracking intermediate decisions, and managing memory across interactions. In more advanced setups, supporting models handle planning, summarization, or memory compression behind the scenes, helping the primary model stay focused and efficient. These architectural shifts are making it possible for AI systems to reason more effectively over time and across tasks.
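To make the idea concrete, here is a minimal sketch of such a context layer in Python. The keyword-overlap scoring, the truncation-based “compression,” and all source names are illustrative assumptions, not any vendor’s API; a production system would use embeddings, real summarization models, and governed connectors.

```python
# Toy context-orchestration layer: select, compress, and sequence
# enterprise knowledge before handing it to a model.

def score(query: str, doc: str) -> int:
    """Relevance = number of query words appearing in the document."""
    words = set(query.lower().split())
    return sum(1 for w in words if w in doc.lower())

def compress(text: str, budget: int) -> str:
    """Crude stand-in for memory compression: truncate to a budget."""
    return text if len(text) <= budget else text[:budget] + "…"

def build_context(query: str, sources: dict, k: int = 2,
                  budget: int = 80) -> str:
    """Pick the k most relevant sources and sequence them for the model."""
    ranked = sorted(sources, key=lambda name: score(query, sources[name]),
                    reverse=True)
    slices = [f"[{name}] {compress(sources[name], budget)}"
              for name in ranked[:k]]
    return "\n".join(slices)

# Hypothetical fragments of enterprise knowledge.
sources = {
    "schema.sql": "CREATE TABLE orders (order_id INT, revenue DECIMAL);",
    "kpi_wiki": "Revenue KPI: sum of orders.revenue per fiscal quarter.",
    "hr_policy": "Vacation requests must be filed two weeks ahead.",
}
context = build_context("how is quarterly revenue computed", sources)
print(context)
```

Even in this toy form, the key property holds: the irrelevant HR policy never reaches the model, and the KPI definition and schema arrive together, in a stable order, within a size budget.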
Without this context layer, even the best models stall on incomplete or siloed inputs. With it, they can reason fluidly across tasks, maintain continuity, and deliver compounding value with every interaction.
Case in point: This isn’t just theory. One standout example comes from McKinsey. Their internal GenAI tool, Lilli, is context engineering in action. The tool unifies over 40 knowledge repositories and 100,000+ documents into a single searchable graph. When a consultant poses a question, it retrieves the five to seven most relevant artifacts, generates an executive summary, and even points to in-house experts for follow-up. This retrieval-plus-synthesis loop has driven ~72% firm-wide adoption and saves teams ~30% of the time they once spent hunting through SharePoint, wikis, and email threads, proof that the decisive edge isn’t just a bigger model, but a meticulously engineered stream of proprietary context (Source).
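The retrieval-plus-synthesis pattern described above can be sketched in a few lines. To be clear, Lilli is proprietary and this is not its implementation; the corpus, scoring, and first-sentence “synthesis” below are toy assumptions that only illustrate the loop: rank artifacts, stitch a summary, surface their owners as follow-up experts.

```python
# Illustrative retrieval-plus-synthesis loop over a tiny mock corpus.
CORPUS = [
    {"title": "Steel market entry 2023", "owner": "A. Mehta",
     "text": "Demand for flat steel grew 4% in SEA. Margins compressed."},
    {"title": "Cement pricing study", "owner": "L. Chen",
     "text": "Cement prices track energy costs. Regional spreads widened."},
    {"title": "Retail loyalty playbook", "owner": "J. Ortiz",
     "text": "Points programs lift repeat purchases by 12% on average."},
]

def relevance(query: str, text: str) -> int:
    """Count query words that occur in the artifact text."""
    return sum(1 for w in set(query.lower().split()) if w in text.lower())

def answer(query: str, k: int = 2) -> dict:
    """Retrieve the top-k artifacts, then synthesize a short summary."""
    top = sorted(CORPUS, key=lambda d: relevance(query, d["text"]),
                 reverse=True)[:k]
    # 'Synthesis' here: first sentence of each retrieved artifact.
    summary = " ".join(d["text"].split(". ")[0].rstrip(".") + "."
                       for d in top)
    experts = [d["owner"] for d in top]
    return {"summary": summary, "experts": experts}

result = answer("steel demand and margins")
print(result["summary"])
print("Ask:", ", ".join(result["experts"]))
```

The value is in the loop’s shape, not its parts: every answer comes back with both synthesized content and pointers to the humans who own the underlying knowledge.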
What Does Context Actually Mean in the Enterprise?
By now, it’s clear that providing the right context is key to unlocking the full potential of AI and agentic systems inside organizations. But “context” isn’t just a document or a code snippet; it’s a multi-layered, fragmented, and evolving ecosystem. In real-world settings, it spans everything from database schemas to team ownership metadata, each layer representing a different slice of what an intelligent system needs to reason, act, and adapt effectively.
Based on my experience working across hundreds of data sources and collaborating with cross-functional product, engineering, and data teams, I’ve found that most enterprise context and information fall into nine broad categories. These aren’t just a checklist; they form a mental model: each category captures a dimension of the environment that AI agents must understand, depending on the use case, to operate safely, accurately, and effectively within your organization.
Read the full article on Packt’s Medium. If you’re new, make sure to follow our Medium handle and subscribe to our newsletter for more insights like this!
⭕ Fabric June 2025 Feature Summary: Microsoft Fabric's June update supercharges productivity: Power BI turns 10 with global celebrations and discounts. Fabric Notebooks now support variable libraries and Copilot-powered inline Python code suggestions. New features across Data Engineering, Science, and Warehouse boost modularity, scalability, and cost-efficiency. Real-Time Intelligence and Eventstream unlock deeper insights with enhanced AI, KQL, and SQL capabilities.
⭕ Simplifying Data Ingestion with Copy job – Incremental Copy GA, Lakehouse Upserts, and New Connectors: Need a faster, smarter way to move data in Microsoft Fabric? The latest Copy job update delivers big wins: Incremental Copy is now generally available for efficient delta transfers, Lakehouse upserts simplify data merges, and 15+ new connectors expand your options. Plus, on-premises to Snowflake and Fabric Data Warehouse ingestion now works seamlessly, no workarounds needed. It’s flexibility and speed, built-in.
⭕ How Cloud SQL boosts performance and cuts costs, per IDC: Struggling with performance, downtime, or scaling in your current database setup? IDC’s latest study shows how moving to Cloud SQL, Google Cloud’s fully managed service for MySQL, PostgreSQL, and SQL Server, drives real impact: 246% ROI, 28% lower costs, and $21.75M annual revenue gains. With AI integration, near-zero downtime, and faster deployments, Cloud SQL transforms database management into a growth engine.
⭕ New ObjectRef data type brings unstructured data into BigQuery: Unifying structured and unstructured data just got easier. Google Cloud introduces ObjectRef in BigQuery, enabling seamless processing of multimodal data like images, audio, and documents alongside tabular data. With full SQL/Python support, native AI integration, and unified governance, ObjectRef empowers teams to build GenAI-powered pipelines using familiar tools, no silos, no infrastructure overhead, just smarter data workflows.
⭕ How Stifel built a modern data platform using AWS Glue and an event-driven domain architecture: Looking to modernize your data architecture? Stifel shows how it's done, by building a scalable, domain-driven platform using AWS Glue, EventBridge, Lake Formation, and more. Their event-driven design enables real-time updates, decentralized data product ownership, and agile orchestration, boosting efficiency, customer experience, and ROI across business domains in a highly regulated financial environment.
⭕ Overcome your Kafka Connect challenges with Amazon Data Firehose: Facing Kafka Connect complexity or cost? Amazon Data Firehose now lets you stream data from Amazon MSK to Amazon S3, fully managed, serverless, and with zero connectors to maintain. The latest update adds custom timestamp offsets, making Kafka Connect migrations seamless. Enjoy auto-scaling, simplified delivery, reduced lag, and lower TCO with event-driven stream processing.
⭕ Alteryx Appoints Arvind Krishnan as CTO for AI Growth: Alteryx has named Arvind Krishnan as its new CTO, signaling a bold push to scale its Alteryx One platform and AI Data Clearinghouse. With deep cloud and AI leadership from Salesforce and Bluecore, Arvind will drive innovation, platform stability, and cloud interoperability, strengthening Alteryx’s position in enterprise analytics and AI amid intensifying market competition.
See you next time!