The customizeValuesYaml.py script simplifies customization of the SBC CORE CNe values.yaml file.
customizeValuesYaml.py
---------------------
Usage: python3 customizeValuesYaml.py <values file name> <csv file>
Note: Python 3 or a later version is recommended.
This script takes two inputs:
- The values.yaml file.
- The metaData CSV file.
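Example (the file names are illustrative):
python3 customizeValuesYaml.py values.yaml customValues.csv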
values.yaml file
---------------
The name of the values.yaml file to be updated with the customized values. Make sure the file has write permission.
Customized CSV file
-----------------
The CSV file has four columns: Attribute Name, Attribute Value, Attribute datatype, and Comments. The first row is the table header. Below is a sample snapshot of the CSV.
Attribute Name,Attribute Value,Attribute datatype,Comments
global.serviceAccount.name,default,str,"Service account provides an identity for processes that run in a Pod"
global.namespace,sbx-dev,str,"Namespace where the Ribbon SBC Core CNe solution has to be deployed"
global.kubernetesPlatform,ocp,str,"Platform on which the Ribbon SBC Core CNe solution has to be deployed"
global.storageClass,netapp-nfs-san,str,"Storage Class for the PVC creation"
global.scaleAttributes.sessionLoadBalancerReplicas,0,int,
global.scaleAttributes.CsMgrReplicas,0,int,
global.mediaPktInterface.pkt0.primary.networkName,ext-net-6,str,"PKT0 interface of the media(SC) pods"
global.mediaPktInterface.pkt1.primary.networkName,ext-net-6,str,"PKT1 interface of the media(SC) pods"
global.signalingPktInterface.pkt0.primary.networkName,ext-net-6,str,"PKT0 interfaces of the signaling(SLB) pods"
global.managementInterface.mgt0.primary.networkName,ext-net-6,str,"MGT interfaces of the OAM pods"
global.managementInterface.mgt0.secondary.networkName,ext-net-6,str,
global.managementInterface.mgt0.gateway,10.228.66.1,str,
global.managementInterface.mgt0.prefix,24,int,
global.managementInterface.mgt0.primaryMgmtIP,10.228.66.187,str,
global.managementInterface.mgt0.secondaryMgmtIP,10.228.66.188,str,
global.imageCredentials.registry,artifactory-tx.rbbn.com,str,
sc.configMap.actualSystemName,RBBNSC3SYS,str,
oam.RAMP.RAMPIP0,10.228.66.180,str,
oam.RAMP.RAMPClusterId,RBBNCNF123,str,
global.rbbnObs.configMap.metrics.output,"[{'server': 'prometheus', 'endpoint': '', 'port': 9001}]",list,
global.interPodSecurity.certificate.subject.organizations,"['RBBN', 'ECI']",list,
GUIDELINES
----------
Below are the guidelines for preparing the input CSV file.
- The datatype column should be populated as int, str, bool, float, list, or dict for integer,
  string, boolean, float, list, and dictionary datatypes respectively; a rough conversion sketch follows this list.
- A list of values for an attribute should be comma separated, enclosed within square brackets,
  and then wrapped in double quotes. Example of a list of values:
- global.interPodSecurity.certificate.subject.organizations,"['RBBN', 'ECI']",list,
- The user can also provide a list of dictionaries, as below. The value will be placed as-is in values.yaml.
- global.rbbnObs.configMap.metrics.output,"[{'server': 'prometheus', 'endpoint': '', 'port': 9001}]",list,
- An empty list and an empty dictionary should be populated as [] and {} respectively.
- String values with multiple words should be provided within double quotes in the CSV.
- True and False are the only allowed bool values; 0 and 1 are not considered bool values.
- Any typo in an attribute name will result in a wrong attribute being inserted, so it is recommended to provide only the
  attributes whose values have to be customized.
- The input CSV file should not contain attributes that do not exist in values.yaml. The user must add such
  attributes to values.yaml before running the script.
- Example: If attributes such as those in the lines below are commented out and their values are to be customized,
  the attributes have to be uncommented.
# Observability Backend - Elasticsearch Credentials
elasticsearchCreds:
  create: False
  #user:
  #  key: ELASTICSEARCH_USER
  #  value:
  #password:
  #  key: ELASTICSEARCH_PASSWD
  #  value:
- The attributes in the above lines are commented out. If they are to be added with new values, these lines have
  to be uncommented in values.yaml before running the script, as below.
# Observability Backend - Elasticsearch Credentials
elasticsearchCreds:
  create: False
  user:
    key: ELASTICSEARCH_USER
    value:
  password:
    key: ELASTICSEARCH_PASSWD
    value:
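The exact type handling is internal to the script, but as a rough, hypothetical sketch of how the datatype column could be mapped to values before they are written into values.yaml (an assumption about the implementation, not the script's actual code):

import ast
import csv
import io

# A few rows in the same format as the metaData CSV (taken from the sample above).
SAMPLE = '''Attribute Name,Attribute Value,Attribute datatype,Comments
global.managementInterface.mgt0.prefix,24,int,
global.interPodSecurity.certificate.subject.organizations,"['RBBN', 'ECI']",list,
global.serviceAccount.name,default,str,Service account for the pods
'''

def convert(value, datatype):
    # Map the datatype column to a Python value (assumed behaviour, per the guidelines above).
    if datatype == 'int':
        return int(value)
    if datatype == 'float':
        return float(value)
    if datatype == 'bool':
        # Only True/False are accepted; 0 and 1 are not treated as booleans.
        return value == 'True'
    if datatype in ('list', 'dict'):
        # Lists and dictionaries are given in literal form, e.g. "['RBBN', 'ECI']" or "{}".
        return ast.literal_eval(value)
    return value  # str: the csv module already removes the enclosing double quotes

reader = csv.reader(io.StringIO(SAMPLE))
next(reader)  # skip the header row
for name, value, datatype, _comment in reader:
    print(f'{name} = {convert(value, datatype)!r}')

Running the sketch prints each attribute name with its converted value.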
NOTE:
-----
- Leading and trailing double quotes are stripped off by the script while updating the string values.
- Empty strings provided in double quotes ("") in the CSV will be written as single quotes ('') in values.yaml.
- Boolean True and False values will be changed to true and false respectively.
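For illustration (the first two attribute names come from the sample CSV above; the bool attribute is hypothetical):
- sc.configMap.actualSystemName,"RBBNSC3SYS",str, results in actualSystemName: RBBNSC3SYS
- global.managementInterface.mgt0.secondary.networkName,"",str, results in networkName: ''
- A bool attribute with the CSV value True is written as true in values.yaml.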
LIMITATIONS:
------------
The script allows inserting new attributes that do not exist in values.yaml, with a limitation: it can insert only one level of a YAML dictionary. Two levels of a YAML dictionary cannot be inserted from the CSV into the values.yaml file. The original values.yaml file must be edited to add the first level.
Example: customizedResourcePrefix defining a new dictionary
- Release default parameter from one of the values files:
## @param global.customizedResourcePrefix template definitions for resource name prefixes
##
customizedResourcePrefix:
  servicePrefixTemplate: ""
  configmapPrefixTemplate: ""
- Expected entry in the final values file:
## @param global.customizedResourcePrefix template definitions for resource name prefixes
##
customizedResourcePrefix:
  servicePrefixTemplate: "svc-{{.Values.global.customizedResourcePrefix.platform.vendor_code}}-{{.Values.global.customizedResourcePrefix.platform.product_code}}-{{.Values.global.customizedResourcePrefix.platform.function_name}}"
  configmapPrefixTemplate: "cm-{{.Values.global.customizedResourcePrefix.platform.vendor_code}}-{{.Values.global.customizedResourcePrefix.platform.product_code}}-{{.Values.global.customizedResourcePrefix.platform.function_name}}"
  platform:
    vendor_code: rbbn
    product_code: sbx
    function_name: mno
- The release default values file should be edited to add the 'platform' dictionary so it can be populated:
## @param global.customizedResourcePrefix template definitions for resource name prefixes
##
customizedResourcePrefix:
  servicePrefixTemplate: ""
  configmapPrefixTemplate: ""
  platform: {}
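With the 'platform: {}' key in place, the platform values can then be supplied from the CSV. Illustrative CSV rows (values taken from the expected entry above):
global.customizedResourcePrefix.platform.vendor_code,rbbn,str,
global.customizedResourcePrefix.platform.product_code,sbx,str,
global.customizedResourcePrefix.platform.function_name,mno,str,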
SUPPORTING TOOL:
----------------
The generateValuesYamlMetaData.py script can generate a CSV file from a working values.yaml, which can be used as a reference for writing the customization CSV file.
Usage: python3 generateValuesYamlMetaData.py <values file name> <csv file>
Note: Python 3 or a later version is recommended.
This script takes two inputs:
- The working values.yaml file from which the CSV file is generated.
- The filename for the generated CSV file.
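Example (the file names are illustrative):
python3 generateValuesYamlMetaData.py values.yaml valuesMetaData.csv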