A single piece of wrong data can drive customers away or lose them forever. Real-time data drives the business and the customer experience, and real-time data integration is imperative for companies to operate efficiently and make informed decisions. To stay ahead of the curve in the modern world, systems must be capable of exchanging data in real time.
A business analyst must engage with stakeholders to create an effective real-time data exchange model. Workshops, brainstorming sessions, and focus groups with stakeholders should yield the elicited requirements. Business rules must then be analyzed to ensure the data exchanges meet the end business objective. Remember that this is an iterative process: the data model may need refining as more is discovered about the business processes.
Let’s go through the process with a real-world example.
Once the elicited requirement is documented, it needs validation from the relevant stakeholders to confirm that it has been captured correctly. While validating the requirement, evaluate whether it aligns with the solution scope and organizational goals. This is also a good time to define measurable evaluation criteria.
Next, update the functional requirements document with the confirmed requirement so that the data exchange model can be defined. Share the business analysis information with all stakeholders for visibility, using a common repository such as Confluence or SharePoint. In this example, the confirmed business requirement is along these lines: each in-store sale must be published in real time so that downstream systems, such as inventory, receive the sale details as they occur.
The subsequent step involves assessing the change and defining the logical and physical models that deliver the required outcome. Because this scenario involves an exchange of data and information, a data dictionary is an ideal technique for specifying the model; defining one also surfaces any information gaps. There needs to be close engagement with the data stewards to ensure the logical data model aligns with the data governance framework.
A well-maintained data dictionary is crucial in this process, providing a clear and consistent reference for data definitions and ensuring seamless communication across systems.
| Field Name | Description | Type | JSON Key | Example |
| --- | --- | --- | --- | --- |
| Message Event | The name of the event for the message. | String | messageEvent | Sales Event |
| Event Timestamp | The timestamp of when the event occurred, in ISO 8601 format. | DateTime | eventTimestamp | 2024-05-05T09:30:00Z |
| Store Location Identifier | Unique identifier of the store where the item was sold. | String | storeLocationIdentifier | 90959 |
| Item Details | An array of details about the item(s) sold. | Array | itemDetails | |
Each entry in the `itemDetails` array has the following fields:

| Field Name | Description | Type | JSON Key | Example |
| --- | --- | --- | --- | --- |
| Stock Keeping Unit (SKU) | A unique identifier for the item in the inventory system. | String | stockKeepingUnit | 109867621 |
| Item Name | The short name of the item sold. | String | itemName | Black Jumper |
| Item Identifier | Unique identifier of the item. | String | itemIdentifier | 898192382132 |
| Quantity | The quantity of item(s) ordered. | Integer | quantity | 5 |
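The JSON keys above follow a simple convention: the field name, minus any parenthetical abbreviation, rendered in lower camel case. A quick sketch of that convention (the `to_json_key` helper is hypothetical, shown only to illustrate the naming rule):

```python
import re

def to_json_key(field_name: str) -> str:
    """Derive a lower-camel-case JSON key from a data dictionary field name."""
    # Drop parenthetical abbreviations such as "(SKU)", then split into words.
    words = re.sub(r"\(.*?\)", "", field_name).split()
    return words[0].lower() + "".join(w.capitalize() for w in words[1:])

print(to_json_key("Store Location Identifier"))  # storeLocationIdentifier
print(to_json_key("Stock Keeping Unit (SKU)"))   # stockKeepingUnit
```

Agreeing on a rule like this keeps the logical and physical names consistent as new fields are added.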
This logical definition is converted to a deliverable story in the sprint for the developers to create a physical model. Define robust acceptance criteria for the story. The acceptance criteria assist the developer in designing the solution.
After development and testing, the physical model would look like this:

```json
{
  "messageEvent": "Sales Event",
  "eventTimestamp": "2024-05-19T18:30:00Z",
  "storeLocationIdentifier": "90959",
  "itemDetails": [
    {
      "stockKeepingUnit": "109867621",
      "itemName": "Black Jumper",
      "itemIdentifier": "898192382132",
      "quantity": 5
    }
  ]
}
```
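The data dictionary doubles as a contract a consuming system can check incoming messages against. A minimal sketch of such a check, assuming Python on the consumer side; the `validate_message` helper and the schema constants are illustrative, not part of any standard library:

```python
import json

# Expected top-level fields and their Python types, taken from the data dictionary.
MESSAGE_SCHEMA = {
    "messageEvent": str,
    "eventTimestamp": str,   # ISO 8601 timestamp, carried as a string
    "storeLocationIdentifier": str,
    "itemDetails": list,
}

# Expected fields for each entry in the itemDetails array.
ITEM_SCHEMA = {
    "stockKeepingUnit": str,
    "itemName": str,
    "itemIdentifier": str,
    "quantity": int,
}

def validate(payload: dict, schema: dict) -> list[str]:
    """Return a list of problems; an empty list means the payload conforms."""
    errors = []
    for key, expected in schema.items():
        if key not in payload:
            errors.append(f"missing field: {key}")
        elif not isinstance(payload[key], expected):
            errors.append(f"{key}: expected {expected.__name__}")
    return errors

def validate_message(raw: str) -> list[str]:
    """Parse a raw JSON sales-event message and validate it field by field."""
    message = json.loads(raw)
    errors = validate(message, MESSAGE_SCHEMA)
    for item in message.get("itemDetails", []):
        errors += validate(item, ITEM_SCHEMA)
    return errors
```

Running `validate_message` on the sample payload above yields no errors; a message with a missing `quantity` would be flagged before it reaches downstream processing.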
Once the model is available, its fields must be mapped between the producing and consuming systems.
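At its simplest, that mapping is a key-rename table maintained alongside the data dictionary. A minimal sketch, where the consumer-side names (`store_id`, `occurred_at`, `event_type`) are hypothetical, chosen only to illustrate the idea:

```python
# Producer key -> consumer key. Keys not listed here pass through unchanged.
FIELD_MAP = {
    "storeLocationIdentifier": "store_id",
    "eventTimestamp": "occurred_at",
    "messageEvent": "event_type",
}

def map_fields(message: dict, mapping: dict) -> dict:
    """Rename producer keys to the consumer's names; pass others through."""
    return {mapping.get(key, key): value for key, value in message.items()}
```

Documenting the mapping explicitly, rather than burying it in integration code, keeps both sides of the exchange accountable to the same definitions.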