Enhancement/nodeapis resolvingissues (#68)
* Updates:

- dataattributegenerator.controller.js - made the names more understandable
- mgmt_automtd_dataattribute_generation.js - resolved a const variable issue causing runtime issues.
- Resolved API endpoint inconsistency and created a random data query API named randomdata.controller.js.
- Created an operational random query node code set. This will help implementations that are more than API-only.

* added latest random query generated in node script with parameters

* added latest random query generated in node script with parameters

* Updates:

- Enhanced randomdata_queries.js to properly output single transactions

* Updates:

- Enhanced the Existing Data API: revised the dataexisting.controller.js code to ensure better JSON responses; when no data is returned, a consistent message with some details for the requestor is returned.

* Updates:

- Created a generatedata_industrystds.js file so data generation can be run from the command line as needed

* Updates:

- Revised the DataGenerated APIs in datagenerated.controller.js to better format the API output.

* Updates:

- Enhancements to Node-APIs README.md.
- Enhancements to Usage-Node-Assets.md based on testing and active implementation feedback.
- Cleanup of the node code modules generatedata_dataattributes.js and generatedata_datastructures.js

* Updates:

- Enhancements to datamodel.controller.js, dataplatform.controller.js, implementationdata.controller.js, randomdata.controller.js to enable better API responses.
- Cleaned up implementationdata.controller.js queries to perform better and be more accurate.

* Updates:

- Enhanced the Existing Data API: revised the referencedata.controller.js code to ensure better JSON responses; when no data is returned, a consistent message with some details for the requestor is returned.
- Enhanced the Existing Data API: revised the termsdata.controller.js code in the same way, returning a consistent message with details for the requestor when no data is found.

Co-authored-by: Jonathan Myer <jonathanmyer@Jonathans-MacBook-Pro.local>
Co-authored-by: Alan Scott <balanscott@outlook.com>
3 people authored Sep 10, 2022
1 parent 7e41407 commit ebba904
Showing 19 changed files with 943 additions and 228 deletions.
92 changes: 47 additions & 45 deletions DataTier-APIs/Node-APIs/README.md
@@ -4,48 +4,52 @@ There are no specific plans to ONLY have one technology for APIs. Currently, we see this as
the best way to address and keep feature parity because we want to ensure that we don't limit
technology.

For these assets you will want to ensure you have the needed versions of Node, npm and yarn installed and working for your environment.
For these assets you will want to ensure you have the needed versions of Node, npm and yarn installed and working for
your environment.

# Settings
The biggest thing to understand is that all settings for this are contained within a .env file. It is important to know
that if you clone the repository the file WILL NOT be included or created. You must manually create a .env file and
the settings used are defined below.
The biggest thing to understand is that all settings for this solution are done through environment variables.
It is important to know that if you clone the repository the file WILL NOT be included or created.

Here is a real-world example of the environment variables:

```
# Platform Settings
export httpPort=8001
export runQuantity=7500
# Auditing
auditing=false
auditingTopicName=kic_dataintgrtntransactions
appintegrationauditingTopicName=kic_appintgrtntransactions
# Output
# values: kafka kafka-datapersistence file rdbms nosql
outputAdapter=file
export auditing=false
export auditingTopicName=kic_appintgrtntransactions
# Output values: kafka kafka-datapersistence file rdbms nosql
export outputAdapter=kafka-datapersistence
# Output Setting
edi_location
fhir_location
hl7_location
export edi_location=undefined
export fhir_location=undefined
export hl7_location=undefined
# Kafka Settings
kafka_server=localhost:9092
kafka_group=""
KAFKA_CONSUMER_TOPIC= ""
KAFKA_PRODUCE_TOPIC=""
kafka_client_id="1234"
export kafka_server=localhost:9092
export kafka_group=undefined
export KAFKA_CONSUMER_TOPIC= undefined
export KAFKA_PRODUCE_TOPIC=undefined
export kafka_client_id="1234"
# Database Tech
rdbms=postgreSQL
export rdbms=postgreSQL
# Postgres Database Setting
PostgreSQL_URL=postgres://postgres:Developer123@localhost:5432/datasynthesis+
export dbURL=postgres://postgres:Developer123@localhost:5432/datasynthesis
# MySQL/MariaDB Database Setting
#dbhost=127.0.0.1
#dbuser=root
#dbpassword=Developer123
#db=datasynthesis
export dbHost=127.0.0.1
export dbPort=1234
export dbUser=root
export dbPassword=Developer123
export dbName=datasynthesis
# Vendor Centric Settings
# iDaaS
iDaaS_FHIR_Server_URI=""
iDaaS_Cloud=true
iDaaS_Cloud_Kafka=
export iDaaS_FHIR_Server_URI=undefined
export iDaaS_Cloud=true
export iDaaS_DataSymthesis_Kafka=idaas_datasynthesis
```
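A minimal sketch of how a Node module might read the settings above is shown below. This is not the repository's actual configuration code; the fallback defaults are assumptions for illustration, and only the variable names come from the example `.env` content.

```javascript
// Hypothetical settings loader; the default values are assumptions,
// only the environment variable names come from the example above.
const settings = {
  httpPort: parseInt(process.env.httpPort, 10) || 8001,
  runQuantity: parseInt(process.env.runQuantity, 10) || 7500,
  auditing: process.env.auditing === "true",
  outputAdapter: process.env.outputAdapter || "file",
  kafkaServer: process.env.kafka_server || "localhost:9092",
  dbURL: process.env.dbURL,
};

console.log(`listening on ${settings.httpPort}, writing to ${settings.outputAdapter}`);
```

Because the repository does not ship a `.env` file, a loader like this fails soft to defaults rather than crashing when a variable is missing.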

# Pre-Requisites
# Pre-Requisites - Node v > 12
This section is intended to help with any pre-requisites, and we have tried to make them as
OS-specific as we can.

@@ -61,14 +65,14 @@ brew install npm <br/>
brew install yarn <br/>
brew upgrade <package> <br/>

# Windows
## Windows
Find the download from https://nodejs.org/en/download/ and install it.

# Linux
## Linux
Depending on your flavor of Linux you will find the needed downloads
https://nodejs.org/en/download/ or within your Linux implementation.

## Node
# Node
We always prefer to be very close to the latest Node and project releases as there are constant performance and security
enhancements occurring within the technology.

@@ -82,15 +86,10 @@ or
yarn install
```

# IDE or Command Line Experience
If you are wanting to leverage the libraries and look at the code from a development experience perspective, then either
having all the proper node

## Running in IDE
The following section is intended to cover generic IDE and platform usage. To date though as long as IDEs have been
setup and are working with Node then we have seen no issues.
# Command Line Experience
From a command line you can follow the following common commands to use the Node APIs.

### Starting the Solution
## Installing/Updating Needed Packages
Always make sure you have either installed or updated the packages first:

Install:
@@ -105,6 +104,9 @@ Upgrade:
npm upgrade
```

## Starting the Solution
Always make sure you have either installed or updated the packages first:

To start the solution from the command line at the project level simply type:
```
npm start
@@ -115,24 +117,24 @@ Or, if you want to work with it locally and potentially enhance it then from the
nodemon app.js
```

## Running in IDE
The following section is intended to cover generic IDE and platform usage. To date though as long as IDEs have been
setup and are working with Node then we have seen no issues.

# Implementation and Usage
The capabilities delivered through this code base are extensive. Below is a series of links to help guide specific
implementation needs and usage-based scenarios within the capabilities provided by the developed Node-APIs.

| Node Implementation Type | Description |
|--------------------------|------------------------------------------------------------------------|
|[Node APIs](Usage-Node-APIs.md) | APIs developed to provide DataSynthesis data access and functionality |
|[Node Usage](Usage-Node-Assets.md)| Assets developed to support the DataSynthesis platform. |
|[Node APIs](Usage-Node-APIs.md) | APIs developed to provide DataSynthesis data access and functionality |

# Testing APIs
To help enable resources to leverage the APIs, we have pre-built, and are continuing to enhance, a set of Postman collections.
The intent is that anyone can see how the APIs can be leveraged simply and directly.

https://www.postman.com/balanscott/workspace/datasynthesis/collection/16526170-6e45e3ca-8eaf-47c9-a0cb-0e024a852505

https://go.postman.co/workspace/DataSynthesis~6a46c0cf-955b-49b4-b495-68940fde4c31/collection/16526170-6e45e3ca-8eaf-47c9-a0cb-0e024a852505?action=share&creator=16526170

Happy Coding

78 changes: 77 additions & 1 deletion DataTier-APIs/Node-APIs/Usage-Node-Assets.md
@@ -1,5 +1,81 @@
# Node-Assets
Within the Node-API efforts there are a set of node assets that can be run from the command line on any
machine where these are implemented.

# Pre-Requisites
- Node installed and configured to work from command line or IDE
- Environment variables set for your OS; we have seen these set up in multiple ways across
implementations.
- The code repo cloned

# Assets
The following are the command line assets that can be run and what they are designed for. These assets will automatically
output to whatever is defined within the environment variable named outputAdapter.

Values for outputAdapter are: kafka kafka-datapersistence file rdbms nosql. The most commonly used and established
ones are kafka-datapersistence and file.

| Node Implementation Type | Description |
|------------------------------------------|---------------------------------------------------------------------|
| generatedata_dataattributes.js | Ability to generate data attributes for platform |
| generatedata_datastructures.js | Ability to generate data structures for platform |
| generatedata_industrystds.js | Ability to generate industry standards data from platform |
| mgmt_automtd_dataattribute_generation.js | Ability to leverage an automated data generator for data attributes |
| randomdata_queries.js | Ability to run random data queries against the platform |
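The outputAdapter-driven behavior described above could be dispatched roughly as sketched below. The adapter names come from the documentation; the wiring and stub implementations are assumptions, not the repository's actual code.

```javascript
// Hypothetical output dispatch keyed on the outputAdapter setting.
// Adapter names match the documented values; bodies are illustrative stubs.
const adapters = {
  file: (records) => `wrote ${records.length} records to a file`,
  kafka: (records) => `produced ${records.length} records to Kafka`,
  "kafka-datapersistence": (records) => `produced and persisted ${records.length} records`,
  rdbms: (records) => `inserted ${records.length} rows`,
  nosql: (records) => `stored ${records.length} documents`,
};

function writeOutput(records, adapterName = process.env.outputAdapter || "file") {
  const adapter = adapters[adapterName];
  if (!adapter) throw new Error(`Unknown outputAdapter: ${adapterName}`);
  return adapter(records);
}

console.log(writeOutput(["a", "b", "c"], "file"));
```

A lookup table like this keeps each command line asset unaware of where its output lands; changing outputAdapter is enough to redirect everything.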

## Usage
In this section we will provide some specific examples. These are not exhaustive, as these assets are very
extensible and support several hundred variations.

### Generate Data Attributes
This provides the SAME capabilities as the API for generating data attributes found at:
/api/generatedata/generate/<attributename>?limit=xxx

There are two arguments: the first is required; the second, if not included, defaults to the runQuantity
environment variable.

node generatedata_dataattributes.js <attributename> <quantity>

1. Generate accountnumbers with the included regular expression. This will use the quantity defined by the runQuantity
environment variable.

node generatedata_dataattributes.js accountnumbers

2. Generate accountnumbers with the included regular expression. This will generate 525 records.

node generatedata_dataattributes.js accountnumbers 525
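The argument handling described above (required attribute name, optional quantity falling back to runQuantity) might look like the following sketch. The function name and error message are hypothetical; the real script may be structured differently.

```javascript
// Hypothetical argv resolution for a generatedata_dataattributes.js-style script.
// The fallback chain (argv -> runQuantity env var) follows the documented behavior.
function resolveArgs(argv, env) {
  const attributeName = argv[2];
  if (!attributeName) {
    throw new Error("Usage: node generatedata_dataattributes.js <attributename> [quantity]");
  }
  const quantity = parseInt(argv[3], 10) || parseInt(env.runQuantity, 10) || 1;
  return { attributeName, quantity };
}

// e.g. `node generatedata_dataattributes.js accountnumbers 525`
console.log(resolveArgs(["node", "script.js", "accountnumbers", "525"], { runQuantity: "7500" }));
```

Note that `parseInt` returns `NaN` (falsy) for a missing or non-numeric argument, which is what lets the quantity fall through to runQuantity.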

### Generate Data Structures
This provides the SAME capabilities as the API for generating data structures found at:
/api/generatedata/generatedatastructures/namedstructure?count=3250&datastructurename=Person Demographics

There is only one argument; the quantity generated will be based on the runQuantity environment variable.

node generatedata_datastructures.js <datastructure name>

1. Generate Person Demographics

node generatedata_datastructures.js "Person Demographics"

### Generate Industry Standards
This provides the SAME capabilities as the API for generating industry standards data found at:
/api/industrystds/generator-hl7?count=100

There are two arguments: the first is required; the second, if not included, defaults to the runQuantity
environment variable.

node generatedata_industrystds.js <industrystd> <quantity>

1. Generate 500 HL7 Messages

node generatedata_industrystds.js hl7 500

### Automated Data Attribute Generation
There is NO single API that provides this capability overall; the functionality is available per data attribute within the
developed APIs. This asset, however, is intended to run continuously, creating data attributes as defined within the
management subsystems. The definition also includes the quantity, so it is intended to be an all-encompassing record.

FYI: as of this writing this was in place but not fully developed!

node mgmt_automtd_dataattribute_generation.js
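The always-running behavior described above could be sketched with a simple interval loop. Everything here is assumed for illustration: the real mgmt_automtd_dataattribute_generation.js reads its definitions (including quantity) from the management subsystem rather than from the stub shown.

```javascript
// Hypothetical continuous generator loop; batch contents, quantity source,
// and timing are stand-ins for the management subsystem definitions.
function generateBatch(quantity) {
  const batch = [];
  for (let i = 0; i < quantity; i++) batch.push(`attribute-${i}`);
  return batch;
}

function startGenerator(quantity, intervalMs) {
  return setInterval(() => {
    console.log(`generated ${generateBatch(quantity).length} attributes`);
  }, intervalMs);
}

const timer = startGenerator(5, 100);
setTimeout(() => clearInterval(timer), 350); // stop after a few batches for the demo
```

In a real deployment the `clearInterval` guard would be replaced by a shutdown signal handler, since the whole point of the asset is to keep running.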

# Implementation/Usage
Empty file.
@@ -17,7 +17,7 @@ router.get("/addresses", async(req, res) => {

});

router.get("/phone-numbers", async(req, res) => {
router.get("/phonenumbers-us", async(req, res) => {
const number_of_phone_numbers = parseInt(req.query.count) || 1000;
const country = req.query.country || "US";
const results = dataattributesGenerator.generateUSPhoneNumbers(number_of_phone_numbers, country)
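A hedged sketch of what a `generateUSPhoneNumbers`-style helper might do is shown below. The NANP-like formatting and digit ranges are assumptions; this is not the repository's actual implementation.

```javascript
// Hypothetical US phone number generator; format and ranges are assumptions.
function generateUSPhoneNumbers(count) {
  const rand = (min, max) => Math.floor(Math.random() * (max - min + 1)) + min;
  const numbers = [];
  for (let i = 0; i < count; i++) {
    // NANP-style: area code and exchange 200-999, 4-digit line number
    const line = String(rand(0, 9999)).padStart(4, "0");
    numbers.push(`${rand(200, 999)}-${rand(200, 999)}-${line}`);
  }
  return numbers;
}

console.log(generateUSPhoneNumbers(3));
```

The route above would then return these as the JSON body, with `count` taken from the query string and defaulted to 1000.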
