This workflow builds on Workflow 1 and walks you through creating a new custom rule using the 'Test a custom rule configuration' endpoint.
In Workflow 1, we saved a custom rule straight to the organization as a demonstration
- this differs from the regular, recommended development flow. Instead, you
should test a rule with dummy data prior to saving it. This workflow
shows how to test a custom rule by passing a rule configuration in the request body
and returning an output from the existing account resource data.
-
Duplicate the POST query named ‘Test saved rule’ and rename it ‘Test configuration’.
-
Modify the request body by adding an accountId field and moving the custom rule details under configuration:
{
  "accountId": "a0b1c2d3-e4f5-a6b7-c8d9-e0f1a2b3c4d5",
  "configuration": {
    "name": "S3 bucket has any Encryption",
    "description": "We want to demonstrate Custom Rules V1",
    "categories": ["security"],
    "riskLevel": "MEDIUM",
    "provider": "aws",
    "enabled": true,
    "service": "S3",
    "resourceType": "s3-bucket",
    "remediationNote": "To remediate, follow these steps:\n1. Step one \n2. Step two\n",
    "attributes": [
      {
        "name": "bucketEncryption",
        "path": "data.Encryption",
        "required": true
      }
    ],
    "eventRules": [
      {
        "conditions": {
          "all": [
            {
              "fact": "bucketEncryption",
              "operator": "notEqual",
              "value": null
            }
          ]
        },
        "description": "Bucket has encryption enabled"
      }
    ]
  }
}
-
Click save and send. The response will be the check outcomes of the rule configuration against the resource data in the chosen account. In the above example, you should see a FAILURE or SUCCESS check for each S3 bucket. In workflow 1, we used the Test run feature to run a rule we already saved to the account, but this alternate approach allows you to speed up the development and testing process without the need to save the configuration first.
-
As a test, change the "operator" from "notEqual" to "equal" and click send again - you will see the check results change based on the new rule logic.
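The effect of flipping the operator can be sketched locally. The following Python snippet is a minimal, hypothetical re-implementation of how an eventRules condition with the notEqual/equal operators might evaluate an extracted attribute - it is not the actual Custom Rules engine, just an illustration of why changing the operator flips each check result:

```python
def extract(resource, path):
    """Walk a dotted path like 'data.Encryption' through nested dicts."""
    value = resource
    for key in path.split("."):
        if not isinstance(value, dict):
            return None
        value = value.get(key)
    return value

def evaluate(condition, resource, attributes):
    """Evaluate a single condition against one resource's data."""
    attr = next(a for a in attributes if a["name"] == condition["fact"])
    fact = extract(resource, attr["path"])
    if condition["operator"] == "notEqual":
        return fact != condition["value"]
    if condition["operator"] == "equal":
        return fact == condition["value"]
    raise ValueError("unsupported operator")

attributes = [{"name": "bucketEncryption", "path": "data.Encryption", "required": True}]
encrypted = {"data": {"Encryption": {"Rules": []}}}  # bucket with an encryption config
unencrypted = {"data": {}}                           # bucket without one

cond = {"fact": "bucketEncryption", "operator": "notEqual", "value": None}
print(evaluate(cond, encrypted, attributes))    # True  -> SUCCESS check
print(evaluate(cond, unencrypted, attributes))  # False -> FAILURE check

cond = {"fact": "bucketEncryption", "operator": "equal", "value": None}
print(evaluate(cond, encrypted, attributes))    # False -> result flips
```

The function and data names here are illustrative only; the real evaluation happens server-side when you send the request.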
Building a new custom rule for another service
So far, we have only used AWS S3 for our examples. To build a custom rule for another
platform and/or service, it is recommended to first build a simple ‘dummy’ rule configuration
and combine it with the run endpoint and resourceData=true, so you can learn the structure
of the resource data and inform your path definitions. You will need a few parameters
to get the resource data, including a resourceId from the chosen service, as well
as the descriptorType value, which is equivalent to the resource-types values in Cloud Risk Management data.
The following example uses Azure Virtual Machines data. You must have an Azure subscription
hosting an Azure Virtual Machine resource integrated with Cloud Risk Management for this example, but the process could be applied to any service or resource type
that is supported by Cloud Risk Management.
-
Refer to the following link for possible values for service and descriptorType (descriptorType in the Custom Rules framework maps to the "resource-types" values from the resource-types endpoint):
-
Using ctrl+f or cmd+f, identify the resource-types values for an Azure Virtual Machine. You should find the following entry in the response from the previous endpoint:
{
  "type": "resource-types",
  "id": "virtual-machines",
  "attributes": {
    "name": "Virtual Machine",
    "provider": "azure"
  },
  "relationships": {
    "service": {
      "data": {
        "type": "services",
        "id": "VirtualMachines"
      }
    }
  }
}
-
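Scanning the response programmatically can be quicker than ctrl+f. This sketch assumes the resource-types endpoint returns a JSON:API-style list under a top-level "data" key (the single entry below is copied from the example above; a real response would contain many entries):

```python
# One entry copied from the example above; a real response would hold many
# of these under a top-level "data" array (an assumption in this sketch).
response = {"data": [
    {"type": "resource-types", "id": "virtual-machines",
     "attributes": {"name": "Virtual Machine", "provider": "azure"},
     "relationships": {"service": {"data": {"type": "services", "id": "VirtualMachines"}}}},
]}

def find_resource_type(resource_types, name, provider):
    """Return (descriptorType, service) for a resource-type by display name."""
    for entry in resource_types["data"]:
        attrs = entry["attributes"]
        if attrs["name"] == name and attrs["provider"] == provider:
            return entry["id"], entry["relationships"]["service"]["data"]["id"]
    return None

print(find_resource_type(response, "Virtual Machine", "azure"))
# ('virtual-machines', 'VirtualMachines')
```

The returned pair gives you the descriptorType and service values used in the steps below.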
To create a rule against Azure data, first run the Checks API to get an example resource ID. Create a new GET API query called ‘get azure check data’. In the TMV1-Filter header, include the accountIds field with your chosen Azure subscription (you can get this by re-running the 'list accounts' query). Make sure that you use top to limit the response.
-
Save and send the above GET query, and note the value of an example resourceId for a given check where the descriptorType = virtual-machines. For Azure virtual machines, you will likely see a long resourceId like "/subscriptions/1abc1234-1234-1234-1234-abcd1d821234/resourceGroups/my-resource-group/providers/Microsoft.Compute/virtualMachines/my-special-virtual-machine"
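Azure resourceIds follow a predictable /key/value path pattern, which you can parse to confirm you grabbed the right resource. A quick sketch (the ID below is the placeholder from the step above, not a real resource):

```python
def parse_azure_resource_id(resource_id):
    """Split an Azure resourceId's alternating /key/value segments into a dict."""
    parts = resource_id.strip("/").split("/")
    return dict(zip(parts[0::2], parts[1::2]))

rid = ("/subscriptions/1abc1234-1234-1234-1234-abcd1d821234"
       "/resourceGroups/my-resource-group"
       "/providers/Microsoft.Compute/virtualMachines/my-special-virtual-machine")

info = parse_azure_resource_id(rid)
print(info["resourceGroups"])   # my-resource-group
print(info["virtualMachines"])  # my-special-virtual-machine
```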
-
Create a new POST Test custom rule configuration query. Make sure accountId has the value of the Azure subscription that you’d like to use from the List all cloud accounts API.
-
For the body of the Test custom rule configuration POST query, construct a simple dummy rule using the appropriate resourceId, service, and resourceType (the descriptorType value identified earlier) from the data. The following example is a rule that checks if the resourceId field is populated - a simple proxy check for whether the data exists, which is sufficient for the goal of returning the resource data:
{
  "accountId": "a0b1c2d3-e4f5-a6b7-c8d9-e0f1a2b3c4d5",
  "configuration": {
    "name": "Check if resource exists",
    "description": "Simple check if resource data exists for given resource",
    "resourceId": "/subscriptions/27b11718-e2c4-4336-b3d6-ac291d8299d3/resourceGroups/CFX-WALLACE-RG/providers/Microsoft.Compute/virtualMachines/double-encrypted-vm",
    "service": "VirtualMachines",
    "resourceType": "virtual-machines",
    "riskLevel": "LOW",
    "enabled": true,
    "provider": "azure",
    "categories": ["security"],
    "remediationNote": "Check if resource exists",
    "attributes": [
      {
        "name": "exists",
        "path": "resourceId",
        "required": true
      }
    ],
    "eventRules": [
      {
        "conditions": {
          "all": [
            {
              "fact": "exists",
              "operator": "notEqual",
              "value": null
            }
          ]
        },
        "description": "Resource exists"
      }
    ]
  }
}
-
Click save and send. The response should be something like:
{
  "resourceId": "/subscriptions/27b11718-e2c4-4336-b3d6-ac291d8299d3/resourceGroups/CFX-WALLACE-RG/providers/Microsoft.Compute/virtualMachines/double-encrypted-vm",
  "region": "global",
  "status": "SUCCESS",
  "description": "Virtual Machine double-encrypted-vm passed 'Resource exists' rule condition.",
  "extraData": [
    {
      "name": "successEvent",
      "label": "Passed Condition Event",
      "value": "Resource exists",
      "type": "META"
    }
  ]
}
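Because only the resourceId, service, resourceType, provider, and accountId change between dummy rules, the request body can be generated with a small helper. This is an illustrative sketch that only builds the JSON body (it mirrors the field layout of the example above; the actual HTTP call is omitted, and the helper name is our own):

```python
import json

def make_exists_rule(account_id, resource_id, service, resource_type, provider):
    """Build a 'Test custom rule configuration' request body for a dummy rule
    that only checks the resourceId field is populated."""
    return {
        "accountId": account_id,
        "configuration": {
            "name": "Check if resource exists",
            "description": "Simple check if resource data exists for given resource",
            "resourceId": resource_id,
            "service": service,
            "resourceType": resource_type,
            "riskLevel": "LOW",
            "enabled": True,
            "provider": provider,
            "categories": ["security"],
            "remediationNote": "Check if resource exists",
            "attributes": [
                {"name": "exists", "path": "resourceId", "required": True}
            ],
            "eventRules": [
                {
                    "conditions": {"all": [
                        {"fact": "exists", "operator": "notEqual", "value": None}
                    ]},
                    "description": "Resource exists",
                }
            ],
        },
    }

body = make_exists_rule(
    "a0b1c2d3-e4f5-a6b7-c8d9-e0f1a2b3c4d5",
    "/subscriptions/.../virtualMachines/double-encrypted-vm",  # placeholder ID
    "VirtualMachines", "virtual-machines", "azure",
)
print(json.dumps(body, indent=2))
```

Swapping in a different resourceId, service, resourceType, and provider gives you a starting body for any other resource type you want to explore.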
