Data is the tangible representation of history, and it can be stored in many forms and variations. However, there must always be a 'Source of Truth'. This requirement is a natural fit for microservices cloud solution designs. Pushing all data-driven activities through the queue provides a natural mechanism for updating the Source of Truth. Any change to the Source of Truth then emits events that propagate the update out to staged data, which comes in many forms, storage types, and locations.
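As a minimal sketch of this flow (the store names, event shape, and view names here are my own illustrative assumptions, not part of any specific design), a single handler consumes queued updates, applies them to the authoritative store, and fans each change out to per-application staged copies:

```python
import json
from collections import defaultdict

# Hypothetical sketch: every write flows through a queue; one handler
# applies it to the source of truth, then propagates the change to
# staged, read-optimized copies tailored to each consumer.
source_of_truth = {}               # the one authoritative store
staged_views = defaultdict(dict)   # per-application staged copies

def handle_update(event):
    """Apply a queued change to the source of truth, then propagate it."""
    key, value = event["key"], event["value"]
    source_of_truth[key] = value
    # Each staged view receives the change in the shape it needs.
    staged_views["web"][key] = json.dumps(value)   # JSON, ready for UI binding
    staged_views["search"][key] = value            # native objects for indexing

handle_update({"key": "customer:42", "value": {"name": "Ada"}})
```

The point of the sketch is the ordering: the Source of Truth is written first, and every staged copy is derived from that one write rather than updated independently.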
Each application should have its dependent data designed and tailored to it. For example, data presented to a visual layer (like a web page) should be stored in a format like JSON so that binding the data to UI components is seamless. In addition, the data should be highly available so that acquisition is as fast as possible. Data integrations should exchange like-kind types to minimize the development effort and processing resources required for transformation.
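A small sketch of what "tailored" means in practice (the row layout and field names are hypothetical): the staging step pre-shapes relational-style rows into exactly the JSON the page component expects, so the page binds it directly with no per-request transformation.

```python
import json

# Hypothetical sketch: stage data in the shape the UI will render,
# so the visual layer binds it as-is.
def stage_for_ui(order_rows):
    """Flatten (id, name, price) rows into the JSON the page expects."""
    payload = [
        {"id": row[0], "label": f"{row[1]} (${row[2]:.2f})"}
        for row in order_rows
    ]
    return json.dumps(payload)

staged = stage_for_ui([(1, "Widget", 9.5), (2, "Gadget", 20.0)])
```

Because the shaping happens once at staging time, the cost is paid on write rather than on every read.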
Data availability is the key to providing rich and speedy applications.
Data lakes and data staging are becoming more prevalent as the cloud becomes a more commonplace hosting landscape for enterprise systems. As this design architecture becomes more established, the software development industry can rethink the need for mapping protocols. Having worked on cloud development teams over the last six years, I have seen the enormous amount of time developers dedicate to mapping data exchanges. In my estimation, eliminating the need for data mapping can reduce typical development costs by as much as 30%.
Data provides life to our applications; it provides the context for information used in decision-making. The time it takes to acquire data can be the single greatest factor in a user's experience. However, developers are so focused on learning JS frameworks, data mapping, compilation, package dependencies, CI/CD, and more that they can't focus on the solutions that will actually improve the user's experience with software. I believe we as an industry need to spend more time eliminating unnecessary effort.
The diagram above illustrates a number of ways to adapt a traditional SQL Server so that it produces JSON directly, eliminating traditional data mapping.
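One concrete form this takes (the table and column names below are hypothetical): SQL Server 2016 and later can emit JSON from a query with the `FOR JSON PATH` clause, so no hand-written row-to-JSON layer is needed. The Python function here only mimics the shape such a query returns, for illustration without a live database.

```python
import json

# Hedged sketch: T-SQL that asks SQL Server itself to produce JSON.
# dbo.Customers and its columns are assumed names for illustration.
QUERY = """
SELECT CustomerId AS [id], Name AS [name]
FROM dbo.Customers
FOR JSON PATH
"""

def rows_to_json(rows):
    """Mimic locally what FOR JSON PATH would return for (id, name) rows."""
    return json.dumps([{"id": r[0], "name": r[1]} for r in rows])

payload = rows_to_json([(1, "Ada"), (2, "Grace")])
```

When the database emits the JSON, the application tier stores or forwards the payload rather than rebuilding it.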
While working on my first microservices project for Experian, I discovered a process being repeated on every development stream. We were all very new to consuming and producing JSON payloads, and I noticed that each developer created a C# model used for casting the JSON payload. Most developers are familiar with the concept of 'brittleness', but I am not sure they saw their own actions as creating this problem.
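To make the brittleness concrete (a Python illustration of the idea rather than the C# models themselves; the field names are hypothetical): a parser that insists on an exact set of fields breaks the moment a producer renames one, while generic access with defaults degrades gracefully.

```python
import json

# Illustrative sketch of payload brittleness.
def tolerant_parse(raw):
    """Generic access with defaults survives payload changes."""
    data = json.loads(raw)
    return {"id": data.get("id"), "email": data.get("email", "")}

# A fixed model would instead do data["email"] and raise a KeyError
# the day the producer renames the field.
evolved = '{"id": 7, "emailAddress": "a@b.com"}'  # producer renamed a field
result = tolerant_parse(evolved)
```

Every hand-written model is one more place that has to change, and fail, whenever a payload evolves.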
At that point, I took it upon myself to create a process for parsing JSON and mapping its values onto other JSON objects (including values that require programmed logic). Please read my white paper (JSON Mapper) to explore the framework further.
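The white paper's framework is not reproduced here, but the general shape of such a mapper can be sketched as follows (this is my own minimal assumption of the approach, not the actual JSON Mapper implementation): a mapping spec pairs each target field with either a dotted source path or a small function, so JSON-to-JSON mapping becomes data-driven instead of hand-coded per payload.

```python
import json

# Minimal, assumed sketch of a declarative JSON-to-JSON mapper.
def get_path(doc, path):
    """Walk a dotted path like 'customer.first' through nested dicts."""
    for part in path.split("."):
        doc = doc[part]
    return doc

def apply_mapping(source, spec):
    """Build a target object from a spec of paths and callables."""
    out = {}
    for target, rule in spec.items():
        out[target] = rule(source) if callable(rule) else get_path(source, rule)
    return out

source = json.loads('{"customer": {"first": "Ada", "last": "Lovelace"}}')
spec = {
    "firstName": "customer.first",  # simple path copy
    "fullName": lambda d: f'{d["customer"]["first"]} {d["customer"]["last"]}',
}
mapped = apply_mapping(source, spec)
```

The spec, not the code, carries the knowledge of each payload's shape, so evolving a payload means editing data rather than recompiling models.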
Copyright © 2023 Fundamental Technology - All Rights Reserved.