Filters
Filters are added as escaped JSON strings formatted as {"key":"<field>","operator":"<comparison_operator>","value":"<value>"}, where the operator is one of the supported comparison operators (for example, eq or contains, as used in the API example below). A concrete filter expression is shown after the list below.
- Refer to the Log fields page for a list of fields related to each dataset.
- Comparison operators define how values must relate to fields in the log line for an expression to return true.
- Values represent the data associated with fields.
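Here is a single filter expression, shown before escaping, that matches requests whose host equals example.com (the field name and value are borrowed from the cURL example later in this section):

```json
{ "key": "ClientRequestHost", "operator": "eq", "value": "example.com" }
```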
The filter field has limits of approximately 30 operators and 1000 bytes. Anything exceeding these limits will return an error.
- Filters can be connected using the AND and OR logical operators.
- Logical operators can be nested.
Here are some examples of how the logical operators can be implemented. X, Y and Z are used to represent filter criteria:
- X AND Y AND Z - {"where":{"and":[{X},{Y},{Z}]}}
- X OR Y OR Z - {"where":{"or":[{X},{Y},{Z}]}}
- X AND (Y OR Z) - {"where":{"and":[{X},{"or":[{Y},{Z}]}]}}
- (X AND Y) OR Z - {"where":{"or":[{"and":[{X},{Y}]},{Z}]}}
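To make the nesting concrete, here is a sketch of the X AND (Y OR Z) pattern with real fields. The host, the /static path, and both field names come from the cURL example below; the /assets path is a hypothetical second value added only for illustration:

```json
{
  "where": {
    "and": [
      { "key": "ClientRequestHost", "operator": "eq", "value": "example.com" },
      {
        "or": [
          { "key": "ClientRequestPath", "operator": "contains", "value": "/static" },
          { "key": "ClientRequestPath", "operator": "contains", "value": "/assets" }
        ]
      }
    ]
  }
}
```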
Filters can be set via the API or the Cloudflare dashboard. Note that using a filter is optional, but if one is used, it must contain the where key.
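Because the filter field holds a JSON string rather than a JSON object, the expression has to be serialized and its quotes escaped before it is placed in the request body. For reference, the filter in the cURL example below is the escaped form of this object:

```json
{
  "where": {
    "and": [
      { "key": "ClientRequestPath", "operator": "contains", "value": "/static" },
      { "key": "ClientRequestHost", "operator": "eq", "value": "example.com" }
    ]
  }
}
```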
Here is an example request using cURL via the API:

```bash
curl https://api.cloudflare.com/client/v4/zones/{zone_id}/logpush/jobs \
  --header "X-Auth-Email: <EMAIL>" \
  --header "X-Auth-Key: <API_KEY>" \
  --header "Content-Type: application/json" \
  --data '{
    "name": "static-assets",
    "output_options": {
      "field_names": ["ClientIP", "EdgeStartTimestamp", "RayID"],
      "sample_rate": 0.1,
      "timestamp_format": "rfc3339",
      "CVE-2021-44228": true
    },
    "dataset": "http_requests",
    "filter": "{\"where\":{\"and\":[{\"key\":\"ClientRequestPath\",\"operator\":\"contains\",\"value\":\"/static\"},{\"key\":\"ClientRequestHost\",\"operator\":\"eq\",\"value\":\"example.com\"}]}}",
    "destination_conf": "s3://<BUCKET_PATH>?region=us-west-2/"
  }'
```

To set filters through the dashboard:
- Log in to the Cloudflare dashboard and select the domain you want to use.
- Go to Analytics & Logs > Logs.
- Select Add Logpush job. A modal window will open.
- Select the dataset you want to push to a storage service.
- Below Select data fields, in the Filter section, you can set up your filters.
- You need to select a Field, an Operator, and a Value.
- You can connect more filters using the AND and OR logical operators.
- Select Next to continue setting up your Logpush job.