DinoAI + MCPs: Hyper-Relevant AI for Analytics Engineering
DinoAI now connects to external tools through Model Context Protocols (MCPs), bringing context from Jira tickets, web searches, and documentation directly into your development workflow. See how these integrations streamline fixing failed pipelines, learning new features, and implementing best practices—all without switching between applications.

Parker Rogers
Apr 16, 2025 · 7 min read
Analytics engineers work across multiple tools and sources of information. Your data warehouse holds your tables and metadata, your repository contains your code, Jira tracks your issues, and documentation lives across various websites and PDFs. Until now, bringing all this context together required constant context-switching between applications – a significant productivity drain that interrupts your flow and slows development.
Today, we're excited to introduce a major enhancement to DinoAI – our "cursor for data" – that brings external context directly into your development workflow through Model Context Protocols (MCPs).
Watch the full demonstration on YouTube.
What are Model Context Protocols?
Model Context Protocols (MCPs) are standardized interfaces that allow DinoAI to connect with external sources of information. Think of them as USB ports for AI – a unified specification that enables your development environment to tap into various information sources.
As Kaustav explained during our livestream: "Model context protocols are like USB. It is a unified specification through which you can bring various different information into your AI context."
While DinoAI has always excelled at understanding your data warehouse and repository context, MCPs expand this capability to include:
Issue tracking systems like Jira
Web search through Perplexity integration
Documentation from various sources
Terminal commands for git and dbt operations
The result is a more comprehensive, contextually aware AI assistant that can help with real-world scenarios that span multiple systems.
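Under the hood, an MCP server advertises its capabilities through a small set of JSON-RPC methods that any client can call. As a rough illustration (the method names below follow the public MCP specification, but the tool name and its arguments are hypothetical examples, not DinoAI's actual integration):

```python
import json

# Sketch of the JSON-RPC message shapes an MCP client exchanges with a
# server. Method names follow the public MCP specification; the tool name
# "jira_get_issue" and its arguments are illustrative assumptions.

def list_tools_request(request_id: int) -> dict:
    """Ask the server which tools it exposes."""
    return {"jsonrpc": "2.0", "id": request_id, "method": "tools/list"}

def call_tool_request(request_id: int, name: str, arguments: dict) -> dict:
    """Invoke one of the advertised tools with structured arguments."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

# Example: fetch a Jira ticket through a hypothetical MCP tool.
msg = call_tool_request(2, "jira_get_issue", {"issue_key": "PARA-123"})
print(json.dumps(msg, indent=2))
```

Because every source speaks this same protocol, adding a new integration means adding a server, not rebuilding the assistant.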
Real-World Scenario: Fixing Failed Production Pipelines
In our demonstration, we showed how DinoAI with MCPs can streamline a common workflow: fixing a failed production pipeline. Here's how it works:
1. Jira Integration for Error Context
When a pipeline fails in Paradime Bolt, a webhook automatically creates a Jira ticket with detailed error information. Using DinoAI's new Jira integration, we can provide this context directly to the agent:
I have a production branch that failed this morning. Details are in Jira ticket PARA-123. Can you help me resolve it?
Upon authorization, DinoAI fetches the ticket details and immediately has access to all the error information, eliminating the need to manually copy error logs.
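The webhook side of this flow can be sketched as follows. This is a minimal illustration rather than Paradime's actual handler – the project key, wording, and issue type are hypothetical – but the payload shape matches Jira Cloud's REST API (v3) for creating an issue:

```python
import json

def build_jira_issue(project_key: str, run_id: str, error_log: str) -> dict:
    """Build the request body for Jira's POST /rest/api/3/issue endpoint.

    A sketch of what a pipeline-failure webhook might send; the project
    key, summary wording, and issue type are illustrative assumptions.
    """
    return {
        "fields": {
            "project": {"key": project_key},
            "summary": f"Production pipeline failed (run {run_id})",
            "issuetype": {"name": "Bug"},
            "description": {
                # Jira Cloud (API v3) expects Atlassian Document Format.
                "type": "doc",
                "version": 1,
                "content": [
                    {
                        "type": "paragraph",
                        "content": [{"type": "text", "text": error_log}],
                    }
                ],
            },
        }
    }

payload = build_jira_issue("PARA", "run-4821", "Compilation error in stg_orders.sql")
print(json.dumps(payload, indent=2))
```

Once the error log lives in the ticket, DinoAI's Jira tool can pull it straight into the conversation.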
2. Terminal Integration for Git Workflow
With the error context in hand, DinoAI doesn't just suggest fixes – it can now execute them through terminal integration. In our demonstration, DinoAI:
Checked the current Git branch
Created a new branch following naming conventions (using the Jira ticket ID)
Identified and fixed the error in the relevant file
Ran dbt run to verify the fix
Committed the changes with an appropriate message
Pushed the branch and suggested creating a merge request
All of this happened without requiring the user to type a single Git command or switch contexts – DinoAI suggested the terminal commands and executed them with user approval.
This streamlined workflow is particularly valuable for team members who might be less familiar with Git operations, as it guides them through best practices while resolving the issue.
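The terminal steps above amount to a short, repeatable command sequence. The sketch below builds those commands from a ticket ID as a dry run that only prints them; the branch-naming convention, model selector, and commit wording are illustrative choices, not a prescribed standard:

```python
def fix_branch_commands(ticket_id: str, summary: str) -> list[str]:
    """Return the shell commands for the branch-fix-verify-push loop.

    Mirrors the workflow described above; embedding the ticket ID in the
    branch name is one common convention.
    """
    branch = f"fix/{ticket_id.lower()}-{summary.replace(' ', '-').lower()}"
    return [
        "git status",                    # confirm where we are before branching
        f"git checkout -b {branch}",     # new branch named after the ticket
        "dbt run --select stg_orders",   # re-run the failing model to verify the fix
        "git add .",
        f'git commit -m "{ticket_id}: {summary}"',
        f"git push -u origin {branch}",  # push, then open a merge request
    ]

for cmd in fix_branch_commands("PARA-123", "fix failed pipeline"):
    print(cmd)
```

With terminal integration, DinoAI proposes each of these commands and waits for approval before running it.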
Accessing Up-to-Date Knowledge with Perplexity
Another powerful MCP we demonstrated is integration with Perplexity for web search. This addresses one of the key limitations of traditional LLMs – their knowledge cutoff dates.
When exploring newer features like dbt unit tests, DinoAI can now search the web through Perplexity to find the latest documentation, examples, and best practices:
I want to learn about dbt unit tests. How do they work and how can I implement them in my project?
The Perplexity integration returns:
Links to official documentation
Examples from community resources
Best practices based on real-world implementations
Step-by-step guidance for implementation
This isn't just a simple web search – it's an AI-curated exploration of relevant resources that DinoAI can then use to help implement the feature in your specific project.
As our demonstration showed, after learning about unit tests through Perplexity, DinoAI can immediately help implement them in your project using the context of your specific models.
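For reference, dbt unit tests (available since dbt 1.8) are defined in a model's YAML file with mocked inputs and expected output rows. This is a hedged sketch – the `customer_orders` model, file path, and column values are hypothetical – but the `unit_tests` schema follows dbt's documentation:

```yaml
# models/marts/_unit_tests.yml -- hypothetical path and model names
unit_tests:
  - name: test_order_totals_are_summed
    model: customer_orders
    given:
      - input: ref('stg_orders')
        rows:
          - {customer_id: 1, order_total: 10}
          - {customer_id: 1, order_total: 15}
    expect:
      rows:
        - {customer_id: 1, lifetime_value: 25}
```

Because DinoAI already knows your models' columns from warehouse and repository context, it can fill in realistic `given` and `expect` rows for your own project.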
Paradime Documentation Search
For Paradime-specific features, we've also added a dedicated web search tool that focuses on our documentation. This allows DinoAI to quickly answer questions about platform features and best practices:
How can I set up column-level lineage in Paradime?
The web search returns specific documentation about column-level lineage, including setup instructions, use cases, and integration details – all without leaving your development environment.
The Power of Combined Context
What makes these integrations truly powerful is how they work together. In our demonstration, we showed a complete workflow that combined:
Warehouse context: Understanding the data structure
Repository context: Identifying the relevant files and code
Jira context: Understanding the specific error
Terminal integration: Executing Git and dbt commands
Web search: Learning about new features and best practices
This holistic approach transforms DinoAI from a helpful code assistant to a true development companion that understands your entire workflow and toolchain.
Why MCPs Matter for Analytics Engineering
Analytics engineering workflows are inherently cross-functional and involve multiple systems. The ability to bring context from these different sources into a single assistant creates several significant benefits:
Reduced context switching: Stay in flow by keeping all relevant information in one place
More accurate assistance: By understanding the full context, DinoAI produces more relevant and accurate suggestions
Accelerated learning: Quickly understand new features or concepts without extensive research
Streamlined troubleshooting: Fix issues faster by combining error context with code understanding
Guided best practices: Follow established patterns for Git operations, testing, and documentation
Getting Started with DinoAI MCPs
DinoAI's MCP integrations are available now in Paradime. Here's how to get started:
Jira integration: Connect your Jira instance in your organization settings
Web search: Available directly in the DinoAI interface under Tools
Terminal integration: Enabled by default in the DinoAI interface
As Kaustav emphasized during our livestream, we've designed these integrations to require minimal setup: "When we started Paradime and how we have always built Paradime, it's about the ease of use. It's about removing friction when it comes to development and infrastructure work."
Beyond Fixing Bugs: Expanded Use Cases
While our demonstration focused on fixing a failed pipeline, MCPs enable many other powerful workflows:
Feature implementation: Use web search to understand best practices, then implement with warehouse and repository context
Documentation generation: Create comprehensive documentation by combining code understanding with external examples
Knowledge sharing: Accelerate onboarding by providing contextual guidance that spans multiple systems
Exploratory analysis: Combine warehouse understanding with external domain knowledge for more insightful analysis
The Future of MCPs
This is just the beginning of our MCP journey. We're actively working on additional integrations to bring even more context into your development environment:
PDF context: Import business requirements and specifications directly from documents
BI tool integration: Understand dashboard definitions and metrics
Meeting notes: Incorporate context from collaboration tools
Custom knowledge bases: Connect to your organization's specific documentation
Try It Today
If you're looking to accelerate your analytics engineering workflow by bringing external context directly into your development environment, these MCP integrations can dramatically reduce the friction in your daily work.
Current Paradime users can start using these MCP integrations immediately. New users can try DinoAI for free today and experience how bringing together multiple sources of context can transform your development workflow.
In our next livestream, we'll be showcasing more advanced uses of these integrations, demonstrating how they can be combined to solve even more complex analytics engineering challenges. Stay tuned!