Table of Contents
- 1. Merchants access conversational payments data
- 2. Expansion of Pagos’ MCP server
- 3. Conversational access to payment data
- 4. MCP server features
- 5. Benefits for merchants
- 6. Payment operations optimization
- 7. Impact on the payments industry
- 8. Conclusions on conversational access to payments data
Merchants access conversational payments data
- Pagos expanded its server based on the Model Context Protocol (MCP) so that merchants can query their own payments data through AI agents such as ChatGPT, Claude, and Gemini.
- The pitch: verified answers to complex questions without “diving” into dashboards, thanks to aggregated, harmonized, and enriched data.
- The server exposes metrics such as approval rates, fee breakdowns, processor performance, and chargeback reasons, among others.
- The expansion relies on no-code integrations and an ingestion API to connect processors and unify fields that vary across networks and acquirers.
Queries on unified data
When “verified and harmonized data” is mentioned here, it’s not about the model “guessing” the answer, but rather that the conversation is grounded in the merchant’s own dataset (events and metrics) already unified across sources.
In practice, the leap versus a traditional dashboard is that the natural-language question is translated into queries over normalized fields (for example, decline reasons or issuer names), which reduces ambiguities typical of processor-by-processor reporting.
Expansion of Pagos’ MCP server
Pagos, an AI-powered payments intelligence company, announced a “major” expansion of its MCP (Model Context Protocol) server to enable merchants to access their verified payments data directly from conversational AI agents.
The move extends the bet launched in June 2025, when Pagos released an open-source MCP server aimed at enabling language models and AI tools to query real-time BIN-level (Bank Identification Number) data from card networks. Now, the focus shifts from network data to the merchant’s operational data: aggregated transactional events and performance metrics.
Integrated real-time queries
Timeline (what changed and what it enables):
1) June 2025: open-source MCP server for real-time BIN-level queries from card networks.
2) Announced expansion: Pagos moves to hosting an MCP that exposes to the merchant its own aggregated dataset (transactional events) and operational metrics (approval, fees, performance by processor, disputes).
3) Expected result: the payments team queries “in conversation” within its usual AI agent, instead of switching between exports and dashboards.
In the words of Klas Bäck (CEO and cofounder of Pagos), the intent is for payments operators to “want answers now” within their existing AI workflows.
The core promise is to change how payments teams query and analyze information: instead of navigating dashboards or requesting reports, they can ask questions in natural language and get verified answers.
In this context, “verified” means the answers are grounded in the merchant’s own data that Pagos aggregates and harmonizes (for example, transactional events and operational metrics), rather than relying on estimates or the model’s assumptions.
Pagos gives examples of the kinds of queries it aims to solve: “Which decline codes drove the spike in declines on February 3?” or “What are my top dispute reasons this quarter?”. The idea is that the payments professional gets context and explanation in a conversation, backed by consolidated data.
According to Albert Drouart, CPO and cofounder of Pagos, the goal is to reduce the time spent exploring raw data and deliver “instant and actionable insights” based on “verified and reliable” information.
From Question to Verified Answer
How you get to a “verified” answer (quick mental model):
1) Question: the operator formulates a query in natural language (e.g., declines, fees, disputes).
2) Query: the AI agent calls the MCP server to query the merchant’s dataset.
3) Normalization: fields are read through a harmonized layer (same names/formats across processors and networks).
4) Answer: the agent returns metrics and breakdowns (by processor, region, network, etc.) and the “why” based on that data.
Useful checkpoint: if the question mixes internal definitions (e.g., “net approval” vs “gross”), adding that “business context” helps ensure the answer uses the same definition as the team.
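The four steps above can be sketched in a few lines. Everything here is a hypothetical illustration — the processor names, field names, and reason mapping are invented for the sketch and are not Pagos' actual schema or API:

```python
# 1) Consolidated raw events from two hypothetical processors.
RAW_EVENTS = [
    {"processor": "proc_a", "date": "2025-02-03", "result": "declined", "reason_raw": "51"},
    {"processor": "proc_a", "date": "2025-02-03", "result": "declined", "reason_raw": "51"},
    {"processor": "proc_b", "date": "2025-02-03", "result": "declined", "reason_raw": "insufficient_funds"},
    {"processor": "proc_b", "date": "2025-02-03", "result": "approved", "reason_raw": ""},
]

# 2) Harmonized layer: processor-specific reason values map to one taxonomy.
REASON_MAP = {
    ("proc_a", "51"): "insufficient_funds",
    ("proc_b", "insufficient_funds"): "insufficient_funds",
}

def harmonize(event):
    out = dict(event)
    out["decline_reason"] = REASON_MAP.get(
        (event["processor"], event["reason_raw"]), "other")
    return out

# 3) The query the agent would run for:
#    "Which decline reasons drove declines on February 3?"
def decline_reasons_on(date, events):
    counts = {}
    for e in (harmonize(ev) for ev in events):
        if e["date"] == date and e["result"] == "declined":
            counts[e["decline_reason"]] = counts.get(e["decline_reason"], 0) + 1
    return counts

# 4) The answer is grounded in the merchant's own (normalized) data:
answer = decline_reasons_on("2025-02-03", RAW_EVENTS)
```

Note that the two processors report the same underlying reason under different labels; only after the harmonization step do the three declines roll up into one answer.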
MCP server features
Pagos claims the server relies on three capabilities: data consolidation, harmonization, and enrichment. In practice, the merchant connects its processors to Pagos via no-code integrations or an ingestion API; from there, the MCP enables querying the unified set from the chosen AI client.
Harmonization involves normalizing fields that can vary in name and format across processors, networks, and products (for example, issuer and decline reason/code), so that the same question returns consistent results across different sources.
| Capability | What it solves | Example in payments | Why it matters in conversation |
|---|---|---|---|
| Consolidation | Bring scattered data together in one place | Combine events from multiple processors/acquirers | Avoids “which dashboard was that number on?” |
| Harmonization | Make fields and values comparable across sources | Map decline_reason/decline_code to a consistent taxonomy | Lets you ask once and get consistent results |
| Enrichment | Add attributes that don’t come in the base event | Fill in issuing bank, card type, brand, method, routing options | Provides context to explain causes and propose actions |
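The three capabilities in the table can be shown side by side in one small sketch. The processor schemas, code mapping, and BIN table below are all invented for illustration; they stand in for whatever real sources a merchant connects:

```python
# Two hypothetical processors that name and encode the same fields differently.
PROC_A_EVENTS = [{"declineCode": "05", "cardBin": "411111", "amt": 120}]
PROC_B_EVENTS = [{"decline_reason": "do_not_honor", "bin": "411111", "amount": 80}]

# Hypothetical BIN table used for the enrichment step.
BIN_TABLE = {"411111": {"issuer": "Example Bank", "brand": "visa", "card_type": "credit"}}

def from_proc_a(e):
    # Harmonization: map proc_a's numeric code and field names to the shared schema.
    return {"decline_reason": {"05": "do_not_honor"}.get(e["declineCode"], "other"),
            "bin": e["cardBin"], "amount": e["amt"]}

def from_proc_b(e):
    # proc_b already uses the shared reason labels; only field names change.
    return {"decline_reason": e["decline_reason"], "bin": e["bin"], "amount": e["amount"]}

def unify():
    # Consolidation: one list of events from both sources.
    events = [from_proc_a(e) for e in PROC_A_EVENTS] + \
             [from_proc_b(e) for e in PROC_B_EVENTS]
    # Enrichment: add BIN-derived attributes to each event.
    for e in events:
        e.update(BIN_TABLE.get(e["bin"], {}))
    return events

unified = unify()
```

After `unify()`, a question like "declines by issuer" can be answered over one consistent field set, which is what makes the conversational layer reliable.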
Real-time data access
The server is designed to respond over the merchant’s payment data streams—transactions, refunds, declines, and chargebacks—and return operational metrics such as:
- approval rates,
- fee breakdowns,
- performance metrics by processor,
- and other indicators derived from aggregated events.
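Two of the metrics listed above are straightforward to derive once events are unified; the event shape below is an assumption made for the sketch, not Pagos' schema:

```python
# Hypothetical unified events (status and per-transaction fee per processor).
EVENTS = [
    {"processor": "proc_a", "status": "approved", "fee": 0.30},
    {"processor": "proc_a", "status": "declined", "fee": 0.00},
    {"processor": "proc_b", "status": "approved", "fee": 0.25},
    {"processor": "proc_b", "status": "approved", "fee": 0.25},
]

def approval_rate(events):
    # Share of attempts that were approved.
    approved = sum(1 for e in events if e["status"] == "approved")
    return approved / len(events)

def fees_by_processor(events):
    # Total fees broken down by processor.
    totals = {}
    for e in events:
        totals[e["processor"]] = round(totals.get(e["processor"], 0.0) + e["fee"], 2)
    return totals

rate = approval_rate(EVENTS)       # 3 of 4 attempts approved
fees = fees_by_processor(EVENTS)
```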
In the previous stage (2025), the emphasis was on real-time BIN-level queries from card networks; with the expansion, Pagos incorporates the merchant’s direct access to its “layer” of harmonized data, with the goal of speeding up diagnostics and decisions.
Integration with artificial intelligence
The integration is done through connectors in the AI clients themselves (for example, ChatGPT, Claude, or Gemini), allowing analysis to happen within the team’s usual workflow.
According to the launch description, merchants with a Pagos account and payment data sources already connected can add the MCP server from their AI client’s connector settings and start querying their data conversationally.
Pagos also highlights the possibility of adding “business context” to refine answers: for example, internal definitions of metrics, relevant segmentations, or operational rules. In addition, the company notes that its internal chatbot, Pagos AI, had already been working as a conversational interface within its platform for nearly two years; the MCP aims to bring that experience to external tools that customers already use.
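How "business context" might disambiguate a metric can be sketched as follows. The "net vs. gross approval" distinction and the event fields are hypothetical examples of the kind of internal definition a team could supply, not a documented Pagos feature shape:

```python
# Hypothetical attempts: one approval is a retry of an earlier decline.
EVENTS = [
    {"status": "approved", "is_retry": False},
    {"status": "approved", "is_retry": True},
    {"status": "declined", "is_retry": False},
]

def approval_rate(events, context):
    # The team's own definition decides which attempts count.
    if context["approval_metric"] == "net":
        events = [e for e in events if not e["is_retry"]]
    approved = sum(1 for e in events if e["status"] == "approved")
    return approved / len(events)

net_rate = approval_rate(EVENTS, {"approval_metric": "net"})      # excludes retries
gross_rate = approval_rate(EVENTS, {"approval_metric": "gross"})  # counts every attempt
```

The point of the sketch: the same dataset yields different numbers under different definitions, which is why supplying the team's definition up front keeps conversational answers aligned with internal reporting.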
Benefits for merchants
The expansion targets a recurring problem in payments: data fragmented across processors, inconsistent formats, and long times to arrive at an actionable explanation. Pagos frames the launch around three trends: leaner teams, demand for instant answers, and the consolidation of AI as the business intelligence interface.
Key questions with actionable metrics
Examples of questions (and what metrics they usually return) when data is aggregated/harmonized:
– “Which decline codes explain the spike in declines on February 3?” → approval rate/decline rate by hour, by processor, by region, by brand; distribution by code/reason.
– “What are my top dispute reasons this quarter?” → count and amount of chargebacks by reason, by product/channel; trend vs. prior period.
– “How did my fees by processor move this month?” → breakdown of fees by processor, by method, by country; variation vs. baseline.
The practical value appears when the same question can be sliced by dimension (processor, region, brand) without manually re-labeling fields between reports.
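Slicing "the same question by dimension" amounts to grouping over harmonized fields. A minimal sketch with invented data:

```python
# Hypothetical harmonized events: every source uses the same field names.
EVENTS = [
    {"processor": "proc_a", "region": "US", "brand": "visa", "status": "declined"},
    {"processor": "proc_a", "region": "US", "brand": "mc",   "status": "approved"},
    {"processor": "proc_b", "region": "BR", "brand": "visa", "status": "declined"},
]

def declines_by(dimension, events):
    # Count declines grouped by any harmonized dimension, with no re-labeling.
    counts = {}
    for e in events:
        if e["status"] == "declined":
            counts[e[dimension]] = counts.get(e[dimension], 0) + 1
    return counts
```

Because the fields are already consistent, `declines_by("processor", ...)`, `declines_by("region", ...)`, and `declines_by("brand", ...)` are the same one-line question asked three ways.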
Improved decision-making
With harmonized data—for example, normalizing fields such as the issuer name or the decline code/reason, which can vary across processors, brands, and products—merchants can compare “apples to apples” and detect root causes more quickly.
Operationally, this can translate into more agile decisions on routing, configuration adjustments with acquirers, or prioritizing actions to reduce declines and disputes.
Operational efficiency
The conversational approach aims to cut manual tasks: less time exporting reports, joining tables, or interpreting different nomenclatures across providers. Pagos positions the server as a way to “deliver instant expertise” without relying on consultants or analyst reports.
Payment operations optimization
Beyond answering questions, Pagos proposes an additional step: bringing payments intelligence “from the dashboard to the workflow.” In its view, MCP makes it possible to build agentic flows that connect payments data with the merchant’s internal systems and automate tasks such as monitoring, alerts, or routing recommendations.
The enrichment component—backed by Pagos’ BIN database—adds attributes such as issuing bank, card type, brand, payment method, and alternative routing options, in order to contextualize events and speed up corrective actions.
High-Impact Operational Use Cases
Checklist of operational use cases that tend to “pay off” quickly once data is harmonized:
– Daily monitoring: are approvals and declines by processor/region within range?
– Spike diagnosis: identify in minutes the top codes/reasons that explain an anomaly.
– Provider performance: compare processors with the same definition of metrics.
– Disputes/chargebacks: top reasons, rates by channel/product, and changes vs. the previous quarter.
– Fees: detect variations by method/country/processor and prioritize where to investigate.
– Routing: test hypotheses (e.g., “if I move X% of traffic, what happens to approval/fees?”) before executing changes.
Checkpoint: if an answer suggests an action (e.g., changing routing), it’s advisable to validate the cut (segment, country, brand) to avoid conclusions from mixed populations.
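The checkpoint above can be made concrete: an aggregate number can point one way while every segment points the other (a Simpson's-paradox-style effect from mixed populations). All figures below are invented to produce that effect:

```python
# Hypothetical (approved, total) counts per (processor, country) cut.
SEGMENTS = {
    ("proc_a", "US"): (81, 87),
    ("proc_a", "BR"): (192, 263),
    ("proc_b", "US"): (234, 270),
    ("proc_b", "BR"): (55, 80),
}

def overall_rate(processor):
    # Aggregate approval rate across all countries for one processor.
    approved = sum(a for (p, _), (a, t) in SEGMENTS.items() if p == processor)
    total = sum(t for (p, _), (a, t) in SEGMENTS.items() if p == processor)
    return approved / total

def rates_by_country(processor):
    # The same metric, validated cut by cut.
    return {c: a / t for (p, c), (a, t) in SEGMENTS.items() if p == processor}
```

With these numbers, proc_b looks better overall, yet proc_a wins in every country: the aggregate mixes two populations with very different volumes. That is exactly why the breakdown should be reviewed before acting on a routing suggestion.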
Impact on the payments industry
The expansion of the MCP server fits into a broader trend: AI as a query and orchestration layer over enterprise data. In payments, where complexity grows due to the multiplicity of acquirers, networks, methods, and rules, the ability to ask and obtain verified answers can become a competitive differentiator.
Pagos also underscores a sensitive point for the sector: the server does not share data between clients, keeping access limited to the merchant's own information. In a market where payments observability is often fragmented, the proposal to unify and "translate" data for conversational consumption puts pressure on traditional analytics providers and orchestration platforms to accelerate integrations with AI agents.
Balance between speed and accuracy
Trade-off introduced by adding a conversational layer over payments data:
– Pros: speed (less “dashboard digging”), self-service for lean teams, and more consistency when fields are harmonized.
– Cons: risk of misinterpretation if the question is ambiguous (internal definitions, time windows, segments) or if the user does not validate the correct cut.
– Practical mitigation: harmonization reduces errors due to nomenclature, but it remains key to set definitions (“what is approval,” “what do fees include”) and review the breakdown by dimension before acting.
Conclusions on conversational access to payments data
The expansion of the Pagos MCP server aims to turn conversation into the new interface for payments operations: less friction to query, more speed to explain and act, and a layer of harmonized data that reduces ambiguities across sources.
Transformation of the payments sector
If the model consolidates, the day-to-day work of payments teams may shift from manual exploration to oversight and assisted decision-making, with complex questions resolved in seconds and with traceability to consolidated data.
Advantages of using artificial intelligence in data management
The value is not only in "using AI," but in combining it with verified, normalized, and enriched data. In payments, where the same phenomenon can look different depending on the processor, that harmonization is key for the conversation to produce useful and comparable responses.
The future of access to real-time data
The direction set by Pagos suggests a future where access to critical metrics—approvals, fees, disputes, performance by provider—is integrated directly into conversational tools and automated workflows, reducing the time between detecting a problem and executing a fix.
Transform your customer experience with Suricata Cx
Suricata Cx is presented as a proposal to elevate the customer experience through more agile and scalable operations, especially in high-volume interaction environments.
Unified context for actionable conversations
A practical bridge between “conversational payments” and customer experience:
– When support/sales/collections work with context (payment status, failed attempts, reason for rejection, fees, or disputes), the conversation stops being generic and becomes actionable.
– The same harmonized-data logic helps different channels (call center, WhatsApp, portal) speak the same operational “language,” avoiding inconsistent responses.
– In omnichannel operations, traceability (what was asked, what metric answered, and what action was taken) is what makes it possible to scale without losing control.
The comprehensive solution for ISPs and telecommunications operators
Aimed at internet providers and telco operators, the platform seeks to unify sales and support processes, with a management layer that facilitates service and follow-up.
Cost optimization and improved customer satisfaction
The approach combines operational efficiency with improvements in response and resolution times, two variables that often directly impact service perception.
Scalability and efficiency in sales and support operations
The promise is to support growth—more customers, more inquiries, more channels—without proportionally multiplying costs, relying on automation and process standardization.
Access to payment data on Pagos’ MCP server anticipates a key shift: moving from fragmented dashboards to operational decisions guided by conversation and verified data. In Suricata Cx, that same logic translates into service and collections experiences where AI converses with real context—including payment status—and maintains traceability so teams can act faster and with control.
This analysis focuses on the operational angle: how the combination of harmonized data and conversational interfaces can reduce friction in "lean" teams and speed up decisions in support, sales, and collections workflows, a pattern that Suricata Cx observes recurrently in high-volume omnichannel operations.
This information is based on public announcements and descriptions available as of the time of writing. Exact availability, customer-specific scope, and integration details may vary depending on configuration and product evolution. In payments technology, field names and operational definitions often differ between providers, so it is advisable to validate and align metrics before automating decisions.

Martin Weidemann is a specialist in digital transformation, telecommunications, and customer experience, with more than 20 years leading technology projects in fintech, ISPs, and digital services across Latin America and the U.S. He has been a founder and advisor to startups, works actively with internet operators and technology companies, and writes from practical experience, not theory. At Suricata he shares clear analysis, real cases, and field learnings on how to scale operations, improve support, and make better technology decisions.

