Welcome to the third part of the Xprocess article series, where we focus our attention on creating custom dashboards and reporting views. As you know by now, Xprocess is our fully managed, adaptive business process monitoring platform. To refresh your knowledge on how to create your own metrics and define key performance indicators that help you quantify the level of attainment of your business objectives and targets, click here. To read more on how rules and alerts allow you to monitor your applications and business processes, while quickly identifying the root cause when anomalies occur, click here.
The landing page of Xprocess takes you straight to the dashboards you created previously. Every dashboard provides a top-down overview of Processes, Tasks, Alerts and Metrics, from left to right.
The valuation process example above is split into seven tasks that you can define yourself. The alert history gives you a better view of when tasks turn red or green (e.g. when the asset value change exceeds 5%). Alerts are generated every hour, and clicking on one shows the Metrics related to it. In this specific Valuation example, you can see the exact asset values and the difference that triggered the alert.
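To illustrate the red/green logic described above, here is a minimal Python sketch. The data shape and function name are our own for the example; they are not Xprocess internals:

```python
def task_status(alerts: list[dict]) -> str:
    """Colour a dashboard task: red if any of its alerts is active, green otherwise.
    The {"active": bool} shape is an assumption for this illustration."""
    return "red" if any(a.get("active") for a in alerts) else "green"
```

A task with at least one active alert would therefore show red on the dashboard, and flip back to green once its alerts clear.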
The Timelines section adds an extra dimension to all of the previous information, allowing you to zoom into the time periods related to Alerts, as well as the history of those checks.
Reporting functionality is available in the third tab on the left of the main menu. You can select a suitable timeframe. The first graph shows a graphical representation of the Metrics table on the right.
In a similar way, the Actions table represents alerts in a graphical manner, allowing for greater transparency on the specific days and times when alerts were triggered.
All of the above reports are available whenever the dashboard is up to date, and you can also opt to be notified via e-mail for quicker root cause resolution. On that note, stay tuned for our subsequent article on how you can leverage root cause resolution to implement permanent solutions to your business processes. In the meantime, if you’d like to enquire about Xprocess, please contact us. We’d love to hear from you!
Welcome to the second part of the Xprocess article series, where we focus our attention on setting up proactive alerts. As you know by now, Xprocess is our fully managed, adaptive business process monitoring platform. To refresh your knowledge on how to create your own metrics and define key performance indicators that help you quantify the level of attainment of your business objectives and targets, read our previous article in the series.
With 24/7 monitoring of your applications and business processes, Xprocess will automatically alert you and provide quick identification of the root cause when anomalies occur, allowing you to minimize downtime and focus on delivering business value. Setting up alerts helps you monitor data quality and the data acquisition process, bringing proactive monitoring and quick resolution.
As discussed in the first article on metrics, Xprocess helps you define attributes for your systems using custom metrics, and a healthy baseline for those attributes using rules. Metrics are sent to Xprocess through secure protocols, using the open source agent or via integrated proprietary applications. Once you configure the rules engine, it will trigger alerts whenever metrics and tasks fail to behave as expected.
As a simple example of a fixed-element alert configuration, you can configure an alert on a static threshold for a given metric. If that threshold is breached, for example if metrics are not sent or values fall below the set threshold, an alert is triggered.
For more complex requirements, where one metric depends on the previous one and is therefore contextual, alerts can be defined in a relative, rather than absolute, way.
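The two alert styles just described, a static threshold and a relative, contextual one, can be sketched as follows. This is an illustration of the logic only, with hypothetical function names, not Xprocess's actual rules engine:

```python
from typing import Optional

def static_alert(value: Optional[float], threshold: float) -> bool:
    """Alert when the metric is missing or falls below a fixed threshold."""
    return value is None or value < threshold

def relative_alert(previous: float, current: float, max_change_pct: float) -> bool:
    """Alert when the metric moves more than max_change_pct
    relative to its previous value."""
    if previous == 0:
        return current != 0  # no baseline: any non-zero value is anomalous
    change_pct = abs(current - previous) / abs(previous) * 100
    return change_pct > max_change_pct
```

For instance, `relative_alert(100.0, 106.0, 5.0)` fires because the value moved 6% against a 5% tolerance, while a move to 104.0 would not.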
Rules and alerts can also be configured to notify the business when an error occurs, for example when a metric is received without error but its data is incomplete, helping you monitor the quality of your data.
Within Xprocess you can use default alerts or create your own, which can be delivered as dashboard colours, desktop pop-ups or e-mail reports. E-mail alerts can also be customised to be sent to specific teams within the business and categorised by priority and colour code.
Stay tuned to read our next article where we show you how to create custom dashboards and comprehensive reporting views. In the meantime, if you’d like to enquire about Xprocess please contact us. We’d love to hear from you!
We’re looking for a hands-on DevOps Engineer who will lead the way for automation, continuous integration (CI), and the development and management of our SaaS environment. The ideal engineer will bring their experience, best practices, and a collaborative attitude to help establish and drive our DevOps processes and initiatives.
Contribute to the definition and realisation of a container-based system architecture
Define and develop a deployment automation strategy
Collaborate with the development and product management teams to improve automation of workflows, infrastructure, code testing and deployment across our test and production environments
Support our clients with technical requirements for deployment and monitoring of our hybrid and on-site components
Help increase system performance with a focus on high availability and scalability
Create tools to empower developers and testers and to industrialise development, staging and production infrastructures
Document processes and procedures for transparency, resiliency and improved operations
What we’re looking for
Bachelor’s degree in Computer Engineering, Computer Science or similar field
1-3 years’ professional experience in a DevOps role at a software development firm
Experience with Amazon Web Services (AWS), in particular EKS, ECS, Fargate, EC2
Demonstrable experience architecting and deploying at least one microservices-based .Net solution on Docker / Kubernetes, ideally a migration from a monolith, including container configuration, provisioning, orchestration and clustering
Experience with scripting (Windows / PowerShell), deployment automation and application monitoring
Experience with TeamCity, Gitlab and CI/CD processes
This month we would like to shift your attention to Xprocess – our fully managed, adaptive business process monitoring platform. If you are looking to increase business resilience and transparency, then Xprocess is the perfect solution for you. It offers real-time visibility into business processes by modelling and monitoring them proactively.
With Xprocess you can achieve the following:
Create your own metrics and define key performance indicators
Set up proactive alerts on real-time issues
Create custom dashboards and comprehensive reporting views
Leverage root cause resolution to implement permanent solutions
View historical data to identify trends and patterns
Maximize efficiency and minimize downtime
Integrate the platform with third party applications
In this multi-part series, we will discuss each of the above points.
Part I : Create your own metrics and define key performance indicators
Xprocess allows you to define various types of metrics to track the status of your business processes. Some of these metrics can be used as Key Performance Indicators (KPIs) to help you quantify the level of attainment of your business objectives and targets.
One of our recent enhancements enables you to send monitoring metrics directly from Xmon to Xprocess. This way, you can obtain an unparalleled view of these metrics, allowing you to monitor the flow of requests and data.
Xprocess supports both Static and Historical metrics, with the latter allowing you to record a time series of the measurable value. A metric can have a number of attributes associated with it to define its characteristics or features. These attributes can be of different types or formats and can be optional or mandatory.
Metric definition and custom attributes window
Metrics can be sent to Xprocess from an Xmon agent or from any third-party program. The Xmon agent has a configurable engine that can interface to a variety of data sources such as files (xml, log, csv, etc.), databases (MS SQL, Oracle, etc.) and processes. Regardless of where the metrics come from, they are sent to Xprocess via the REST API.
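As an illustration of what such a REST call might look like, here is a minimal Python sketch using only the standard library. The endpoint URL, authentication scheme and payload field names are assumptions made for the example, not the documented Xprocess API:

```python
import json
from urllib import request

# Hypothetical endpoint and token; substitute your real values
XPROCESS_URL = "https://xprocess.example.com/api/v1/metrics"
API_TOKEN = "your-api-token"

def build_metric_payload(name: str, value: float, attributes: dict) -> dict:
    """Assemble a metric message; the field names here are illustrative."""
    return {"metric": name, "value": value, "attributes": attributes}

def send_metric(payload: dict) -> None:
    """POST the metric to the (assumed) Xprocess REST API over HTTPS."""
    req = request.Request(
        XPROCESS_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {API_TOKEN}"},
        method="POST",
    )
    request.urlopen(req)  # raises urllib.error.HTTPError on HTTP errors

# Example (not executed here): send a portfolio valuation metric
# send_metric(build_metric_payload("PortfolioValuation", 1_250_000.0,
#                                  {"currency": "USD", "desk": "EquityDesk"}))
```

A heartbeat sender would be the same call on a one-minute schedule, with a constant payload.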
Once metrics start flowing into Xprocess, they become available for a multitude of uses such as viewing timelines, building visualisations, or creating rules and alerts. For instance, an external process can send a Heartbeat metric to Xprocess every minute. Another example is a metric set up to receive and monitor the valuation of an asset manager’s portfolio, as shown below.
Metric table, showing collected metric values
Metric value graph, showing values over time
Stay tuned to read our subsequent articles about the other features and functionalities of Xprocess. Our next article will explain how rules and alerts can be defined for metrics to help you monitor your processes proactively and resolve anomalies in a timely fashion. In the meantime, if you’d like to enquire about Xprocess please contact us. We’d love to hear from you!
REST APIs are a modern style of application programming interface used for web-service-based integrations. They have become an industry standard for system integration, providing a scalable, reliable and standardized way to programmatically access functionality and data.
Our suite of products, Xmon, Xprocess and the new Xcatalogue are API-first applications. They are built on top of a solid REST API foundation to allow for better development, testing and integration within our products. We make certain endpoints available to our clients so they can benefit from, amongst others, the following:
Automated report generation and data extractions, which allow instant data extractions into third party platforms such as Tableau or Power BI. By leveraging the tools already at your disposal and the powerful analytics available in Xmon, for example, you will be able to create reports on the fly, tailored to your needs and centralized within your own reporting frameworks and tools.
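As a sketch of the kind of extraction pipeline this enables, the helper below flattens report rows (as a reporting endpoint might return them in JSON) into CSV ready for loading into Tableau or Power BI. The row shape is an assumption for the example:

```python
import csv
import io

def rows_to_csv(rows: list[dict]) -> str:
    """Flatten report rows, e.g. decoded from a reporting endpoint's JSON,
    into a CSV string suitable for BI tools."""
    if not rows:
        return ""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

In practice you would fetch the rows from the API, pass them through this helper, and hand the resulting file to your reporting tool.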
Integration of Xmon analytics into proprietary or in-house developed systems. One perfect example is simulating the cost of a data request, making it easier and quicker to oversee budget spend, directly from internal applications. This will allow users to have direct access to analytics that go hand in hand with your existing data, all within the same environment.
More advanced, hybrid types of integration, where on-site agents are deployed and communicate with the Xpansion SaaS backend. In this type of deployment, you can monitor requests and data calls made between internal applications, which would otherwise not be possible. This type of integration is also suitable when using private leased lines. Advanced integrations using the REST API also allow for automated provisioning of rules, consumers and data sources, enabling a fully automated DataOps workflow using our suite of products.
All endpoints are available over secure HTTPS transport with advanced permissioned access, white-labelled access and a full audit trail for traceability and accountability.
If you’d like to know more about our REST API integration and the possibilities it opens up, do reach out. We’d love to hear from you! To speak to one of our experts, click here.
We are happy to announce the recent launch of our new and improved website, designed with our clients’ needs in mind.
It is full of features for a more interactive and comprehensive dive into the world of Xpansion, with a detailed description of our product suite – Xmon, Xprocess and Xcatalogue – as well as our partners. Have a look for yourself here.
We’d love to hear what you think, request a demo if you’re interested in joining our rapidly growing client base!
Existing clients will also have noticed the new user interface, in line with the Xpansion website theme, as part of our latest software release.
In June 2019, we wrote about the three pillars for establishing a Data Operations function to provide access to data in a transparent, controlled and reportable way.
XMon supports Data Operations teams and enables access to data at scale, without losing grip on usage compliance and spend. The latest XMon release includes a dedicated DataOps module that allows for the management, granular attribution, simulation and reconciliation of data usage.
Viewing and tagging requests
Understanding which consumers are requesting data and for what purpose is a basic principle for data usage transparency. This being said, it is not an easy principle to implement. With the XMon request tagging functionality, Data Administrators can extract all data requests (both internal and external) and assign metadata attributes to these requests, manually or through automated tagging rules. This allows a granular attribution of the data request to a specific consumer (e.g. Back Office Team), a specific business process (e.g. Risk Report) or any other custom defined attribute. Tags are then used by the XMon Data Analytics engine to provide in-depth analytics and a level of detail for usage reports that is not possible to achieve with traditional methods.
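An automated tagging rule of the kind described above can be pictured as a pattern-to-tags mapping applied to each request. The rule patterns, tag names and function below are hypothetical, for illustration only:

```python
import re

# Hypothetical automated tagging rules: pattern on the request name -> metadata tags
TAGGING_RULES = [
    (re.compile(r"^risk_"), {"consumer": "Back Office Team", "process": "Risk Report"}),
    (re.compile(r"^fo_"),   {"consumer": "Front Office",     "process": "Pricing"}),
]

def tag_request(request_name: str) -> dict:
    """Return metadata tags for a data request from the first matching rule,
    or a placeholder tag so the request can be picked up for manual tagging."""
    for pattern, tags in TAGGING_RULES:
        if pattern.search(request_name):
            return dict(tags)
    return {"consumer": "Unassigned"}
```

Requests that match no rule stay flagged for manual attribution, mirroring the manual-or-automated choice described above.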
Data Attribute and Security Lookup
With the XMon DataOps Attribute and Security Lookup feature, Data Administrators can look up any security reference or data attribute (field) used within the organization. This can then be tied back to a specific consumer or business line. Operational tasks like usage reporting and the pruning of expired or unnecessary securities and attributes become much easier, more efficient and more transparent.
Cost simulations and compliance checks
One of the tasks of a DataOps team is to ensure that changes to data requests do not incur unexpected cost spikes and that data requests are compliant with internal and external usage agreements. With the XMon DataOps cost simulation functionality, Data Administrators can obtain the cost of a data request by simply uploading it through the XMon DataOps interface. In addition to providing the relevant cost information, XMon validates the request against a set of compliance and data access rules to ensure it doesn’t breach any terms.
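The essence of a cost simulation with a compliance check can be sketched as follows. The field names, unit costs and allowed-field list are invented for the example and do not reflect any vendor's actual pricing:

```python
# Illustrative only: per-field unit costs and the compliance whitelist are invented
FIELD_COST = {"PX_LAST": 0.05, "CPN": 0.02, "RATING": 0.10}
ALLOWED_FIELDS = {"PX_LAST", "CPN"}

def simulate_request(securities: list[str], fields: list[str]) -> dict:
    """Estimate the cost of a data request (securities x fields)
    and flag any fields outside the usage agreement."""
    cost = len(securities) * sum(FIELD_COST.get(f, 0.0) for f in fields)
    violations = [f for f in fields if f not in ALLOWED_FIELDS]
    return {"estimated_cost": round(cost, 2), "violations": violations}
```

A request is safe to submit when its estimated cost is within budget and the violations list comes back empty.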
Reconciling invoices and reporting
The XMon Invoice Reconciliation feature allows Data Administrators to validate vendor invoices by independently re-calculating and re-building data invoices. This important step provides third-party validation of invoice amounts and ensures all parties have a clear understanding of the way invoices are calculated and how data is charged.
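Conceptually, invoice reconciliation boils down to rebuilding the total from independently collected usage records and comparing it to the vendor's figure. A minimal sketch, with an assumed record shape:

```python
def rebuild_invoice(usage_records: list[dict]) -> float:
    """Recompute the invoice total from usage records of the
    (assumed) shape {"quantity": int, "unit_price": float}."""
    return round(sum(r["quantity"] * r["unit_price"] for r in usage_records), 2)

def reconcile(vendor_total: float, usage_records: list[dict],
              tolerance: float = 0.01) -> dict:
    """Compare the vendor's invoiced total against the rebuilt total."""
    rebuilt = rebuild_invoice(usage_records)
    return {"rebuilt_total": rebuilt,
            "difference": round(vendor_total - rebuilt, 2),
            "matches": abs(vendor_total - rebuilt) <= tolerance}
```

Any non-zero difference beyond the tolerance is a discrepancy to take back to the vendor.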
The XMon DataOps module centralizes data operations, increasing efficiency, ensuring compliance and reducing spend. Contact us for more information about the features available or how we can help your organization take control over reference data.
Microsoft Excel is used extensively throughout organizations. In the context of reference data, users can create spreadsheets containing data vendor formulas (e.g. Bloomberg Reference Data or Refinitiv) to download reference data and use it for custom reports, analysis and bespoke calculations.
Although extremely useful from a user perspective, downloading reference data in Excel has two pitfalls that can challenge the most hardened DataOps teams. First, reference data is subject to usage compliance terms and controlling how data is used once it is in a spreadsheet is notoriously difficult.
Second, reference data has associated costs and providing ubiquitous access to reference data directly from a spreadsheet can cause cost spikes and uncontrolled increases in spend.
XMon addresses these challenges and helps organizations take control of reference data usage in Excel workbooks.
How XMon can help
In early June, we released a new XMon feature to help organizations better manage reference data access in Excel, namely by:
Automatically detecting Excel workbooks that contain reference data access formulas
Allowing Data Operations teams to track which securities and associated fields are being pulled into Excel
Allowing Data Operations teams to create cost and compliance rules to be notified in real-time if an Excel workbook breaches cost or compliance terms
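Detection of vendor reference-data formulas in a workbook can be pictured as a pattern match over cell formulas. The sketch below is illustrative; the pattern list is not exhaustive and the cell-map shape is our own:

```python
import re

# Patterns for common vendor reference-data functions (illustrative, not exhaustive)
VENDOR_FORMULA = re.compile(r"\b(BDP|BDH|BDS|RTGET|TR)\s*\(", re.IGNORECASE)

def find_reference_data_cells(cells: dict[str, str]) -> list[str]:
    """Given a mapping of {cell_address: formula_text}, return the addresses
    whose formula calls a vendor reference-data function."""
    return [addr for addr, formula in cells.items()
            if formula and VENDOR_FORMULA.search(formula)]
```

A scanner built on this idea can walk every sheet of a workbook, inventory the matching cells, and feed the result into cost and compliance rules.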
What this looks like in XMon
DataOps analysts can upload a given workbook to XMon directly or use the XMon tracker agent to detect workbooks with vendor reference data calls. In both cases, XMon will inventory the workbooks found and list them for review. The screenshot below shows what this looks like:
Clicking on the icon displays the workbook factsheet and associated reference data cost, the list of fields and the list of securities detected:
Using the XMon rules engine, data administrators can create compliance rules so they can be immediately notified in case of non-compliance or in case of expensive requests.
Audit and compliance reports
Reporting on reference data usage is notoriously difficult, and coupling this with the requirement to understand which users are retrieving data in Excel workbooks can put a heavy load on DataOps teams trying to ensure compliance and keep costs in check. With XMon, generating evidence-backed usage reports becomes much easier, providing data analysts with the tools necessary to track costs and compliance in real time and to generate accurate and timely usage reports.
Reach out for more information about how XMon can help your organization understand, control and derive insights from your reference data usage.
XMon is known for its ability to track reference data calls and provide in-depth analytics of usage metrics, consumption, cost allocation and spend optimization. Our focus is reference data, but XMon’s core engine is able to process nearly any type of data.
To put this into perspective, we decided to source COVID-19 data and to plug it into XMon. We generated evolution graphs and set up alerts to be proactively notified of the status of the pandemic and of its development globally and per country.
We also had a brief look into whether Covid-19 had any significant impact on reference data consumption across our customer base.
Here’s what we did.
First, it was important to find a reliable source of Covid-19 data that we could connect XMon to. We decided to source ours from the free Covid-19 API, available here:
The API is easy to integrate with and provides historical data in JSON format, sourced from the Centre for Systems Science and Engineering (CSSE) at Johns Hopkins University.
Designing the XMon Metric
Once the data source was decided on, we configured an XMon metric with sufficient attributes to enable us to run reports and process automated alerts. We were interested in recording the following attributes:
Timestamp of data
Number of new cases (daily and cumulative)
Number of recovered cases (daily and cumulative)
Number of deaths (daily and cumulative)
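To give a feel for the transformation involved, the sketch below turns cumulative daily totals (as the API returns them) into records carrying both daily and cumulative values, one per attribute listed above. The field names are illustrative, not the actual XMon metric schema:

```python
def to_metric_records(api_days: list[dict]) -> list[dict]:
    """Convert cumulative daily totals (assumed shape: {"Date", "Confirmed",
    "Recovered", "Deaths"}) into records with daily and cumulative values."""
    records = []
    prev = {"Confirmed": 0, "Recovered": 0, "Deaths": 0}
    for day in api_days:
        records.append({
            "timestamp": day["Date"],
            "new_cases_daily": day["Confirmed"] - prev["Confirmed"],
            "new_cases_cumulative": day["Confirmed"],
            "recovered_daily": day["Recovered"] - prev["Recovered"],
            "recovered_cumulative": day["Recovered"],
            "deaths_daily": day["Deaths"] - prev["Deaths"],
            "deaths_cumulative": day["Deaths"],
        })
        prev = day
    return records
```

Each resulting record maps directly onto the metric attributes above, ready to be posted to XMon.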
Once set up in the system, the XMon metric looked like this:
The XMon data collection agent was configured to obtain data historically, starting from March 1, 2020, and then automatically on a daily basis once the historical upload was completed.
Once enabled, data started flowing into XMon and was available in the monitoring dashboards in a matter of minutes. The graph below shows the number of daily confirmed cases for the UK since March 1, 2020:
The daily deaths in the United Kingdom show a spike on the 29th of April, which corresponds to the UK government’s addition of deaths in care homes:
Automated Notifications and Alerts
While data was being collected, we configured automatic analysis rules to detect, per country, a fall in the number of daily new cases over three consecutive days, and to proactively notify the team when one occurred. The screenshot below shows how the rule was defined in XMon for Australia:
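The logic behind such a rule can be sketched in a few lines: alert when each of the last three reported values is lower than the day before. This is an illustration of the idea, not the XMon rule definition itself:

```python
def falling_for_three_days(daily_new_cases: list[int]) -> bool:
    """True when the last three reported values each decreased
    from the day before (requires at least four data points)."""
    if len(daily_new_cases) < 4:
        return False
    last_four = daily_new_cases[-4:]
    return all(b < a for a, b in zip(last_four, last_four[1:]))
```

Run per country against the daily series, this would fire the notification the first day a three-day decline completes.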
Displayed in a colour-coded dashboard, we can see that all the countries we were monitoring still showed an increase in the number of cases over three consecutive days in the observed week (27th April 2020 – 3rd May 2020):
The XMon Analytics engine continues to process and analyse the data, generating automated alerts whenever the number of Covid-19 cases in a country decreases over a three-day period.
Once historical data acquisition was completed, we ran historical trend analysis graphs, which showed the evolution of the daily deaths attributed to Covid-19 for selected countries of interest as well as the 7-day moving average:
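A trailing 7-day moving average, as used in these graphs, can be sketched as follows (a simple illustration, not XMon's analytics engine):

```python
def moving_average(values: list[float], window: int = 7) -> list[float]:
    """Trailing moving average; the first window-1 points
    average over however many values are available so far."""
    out = []
    for i in range(len(values)):
        chunk = values[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out
```

Smoothing the daily series this way irons out weekday reporting artefacts, which is why the 7-day window is the conventional choice for epidemic data.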
Split by region, we obtain the graph below, showing the number of daily deaths across South America, Oceania, North America, Europe, Asia and Africa:
Did Covid-19 affect reference data request volumes?
An interesting question was whether the Covid-19 pandemic had a noticeable impact on reference data consumption across our client base. We ran a simple scenario to determine whether data volumes had been affected since March 1, 2020, compared to the same period last year. Stripping out variations due to active XMon cost optimisations as well as the effects of significant business changes, we found nearly no difference in data requests over the periods of interest; if anything, volumes may be on a slightly lower trend:
XMon provides powerful data ingestion and processing engines that handle vast amounts of reference data requests across our clients. The XMon engine can also be used to track other types of data, for example internal data requests flowing within the organization or, as this article has briefly shown, scientific data related to the global pandemic we are all witnessing at the moment.
Reach out for more information about the provided analytics, graphs or alerts, or to see how the XMon team can help make sense of your data, whether it’s reference data or otherwise.