If you have run into an issue where the rawRequest or rawResponse fields of an analytics record contain no data, work through the following steps.
1. Confirm that the rawRequest and rawResponse fields are empty
First, confirm that you can see the request and gateway metadata of the analytics record. A screenshot taken via the log browser in the dashboard is attached below.
Analytics record via the log browser in the dashboard
Alternatively, you can use the dashboard Get Logs REST API to check the analytics record retrieved by the dashboard.
Analytics record via the dashboard REST API
curl --location 'http://<DashboardURL>:3000/api/logs?p=0' \
--header 'Authorization: <DashboardApiCredentialsGoesHere>'
Result
{
  "data": [
    {
      "Method": "GET",
      "Host": "httpbin.org",
      "Path": "/anything/result",
      "RawPath": "/anything/result",
      "ContentLength": 0,
      "UserAgent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/113.0.0.0 Safari/537.36",
      "ResponseCode": 200,
      "APIKey": "00000000",
      "TimeStamp": "2023-05-26T12:57:49.769Z",
      "APIVersion": "Non Versioned",
      "APIName": "Http Bin API",
      "APIID": "b9cfbfa2abeb404e620cd02ed858ad13",
      "OrgID": "645c216b2e6d3a0001d17725",
      "OauthID": "",
      "RequestTime": 4232,
      "RawRequest": "",
      "RawResponse": "",
      "IPAddress": "xxx.xxx.xxx.x",
      "Geo": {},
      "Tags": ["key-00000000", "org-645c216b2e6d3a0001d17725", "api-b9cfbfa2abeb404e620cd02ed858ad13"]
    },
    ...
  ]
}
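If you are scripting this check, the two fields can be pulled straight out of the Get Logs response with jq. The sample file below is a hypothetical, trimmed-down response shaped like the output above:

```shell
# Hypothetical trimmed response from the dashboard Get Logs API (same shape
# as the result above, reduced to a few fields for illustration).
cat > /tmp/logs-sample.json <<'EOF'
{"data":[{"Method":"GET","Path":"/anything/result","RawRequest":"","RawResponse":""}]}
EOF

# Print only the two fields of interest for each record; empty strings here
# confirm that no detailed payload was captured.
jq '.data[] | {RawRequest, RawResponse}' /tmp/logs-sample.json
```

In a real check, pipe the curl command above into the same jq filter instead of using a saved file.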
If you cannot see any data in your log browser, then visit our troubleshooting guide on the dashboard not showing any analytics data for more information.
2. Check the key, API or gateway scope for detailed recording
If the values are empty, a common cause of missing detailed request and response data is that detailed recording has been disabled. There are three scopes where this can be enabled or disabled:
- API scope - if enabled, detailed logs are recorded; otherwise move to the next scope.
- Key scope - if enabled, detailed logs are recorded; otherwise move to the next scope.
- Global / gateway (cross-organisation) scope *
You can verify which scope your environment has enabled by looking for the enable_detailed_recording field in the key definition, API definition or gateway configuration.
Note
Please be aware that enabling detailed analytics is expensive, so we recommend it only while testing or debugging. Also, avoid enabling detailed analytics in all scopes at once; each scope has a different priority, as shown above.
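As a quick way to audit a scope, query the relevant JSON for the flag. The file below is a hypothetical key definition excerpt; the same enable_detailed_recording field appears at the top level of an API definition, and the gateway config keeps its equivalent under analytics_config (verify field locations against your gateway version's docs):

```shell
# Hypothetical key definition excerpt. enable_detailed_recording also exists
# at the top level of an API definition, and under analytics_config in the
# gateway's tyk.conf -- confirm locations for your version.
cat > /tmp/sample-key.json <<'EOF'
{ "alias": "demo-key", "enable_detailed_recording": false }
EOF

# false here means detailed logs will not be recorded at the key scope.
jq '.enable_detailed_recording' /tmp/sample-key.json
```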
3. Inspect your pump config for the omission of detailed recording *
If you have ruled out the gateway, key and API definition as possible causes, you can go on to investigate Tyk Pump. Pump is an analytics purger that sends analytics data to the desired data sink, e.g. Mongo, SQL, Kafka, Moesif, Splunk, Datadog etc. The base configuration of every pump includes the omit_detailed_recording field; validate that this field has not been set to true.
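One way to audit this is to list every pump that sets the flag. The config below is a hypothetical pump.conf excerpt; since omit_detailed_recording sits in each pump's base options, check every configured pump:

```shell
# Hypothetical pump.conf excerpt with two pumps configured.
cat > /tmp/pump-sample.conf <<'EOF'
{
  "pumps": {
    "mongo": { "type": "mongo", "omit_detailed_recording": true },
    "csv":   { "type": "csv" }
  }
}
EOF

# Name every pump that drops the detailed payload:
jq -r '.pumps | to_entries[]
       | select(.value.omit_detailed_recording == true) | .key' /tmp/pump-sample.conf
```

Any pump named by this filter will purge records with the rawRequest and rawResponse data stripped.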
4. Ensure you are not hitting your DB document size limit *
If you are using MongoDB as your data sink, be aware that there is a 16MB document size limit when storing analytics records. If you are using Cosmos DB, you may be hitting its 2MB limit unless you have already increased it to 16MB. To mitigate this, verify and shrink the size of your payload body. If not all of the response data is needed, you could trim it by setting a maximum record size via Pump, although this is neither a perfect solution nor always desirable.
Additionally, you can skip an analytics record altogether once the threshold has been surpassed by setting the ...max_document_size_bytes pump config field.
tyk-pump: time="Sep 6 16:42:37" level=warning msg="Document too large, skipping!
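To gauge whether a record is anywhere near the limit, you can measure its JSON size on disk; 16MB is 16777216 bytes. A minimal sketch on a hypothetical saved record:

```shell
# Hypothetical analytics record saved to disk (e.g. captured from the
# dashboard Get Logs API); payload values are illustrative base64 strings.
cat > /tmp/record.json <<'EOF'
{"RawRequest":"R0VUIC9hbnl0aGluZy9yZXN1bHQ=","RawResponse":"SFRUUC8xLjEgMjAwIE9L"}
EOF

# Byte size of the record; compare against Mongo's 16777216-byte document limit.
wc -c < /tmp/record.json
```

Note this measures the JSON form, not the stored BSON document, so treat it as a rough indicator only.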
5. Examine pump/MDCB logs in debug mode *
If you are not hitting the MongoDB document size limit, or you are not using MongoDB as your data sink, you may have to troubleshoot Pump or MDCB to investigate further. Enabling debug mode via the log_level config allows debug logs to be emitted during the purge operation. More information about debug logging can be found in our guide on enabling debug mode for Tyk components.
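For Tyk Pump this is a one-line change in its config file. A minimal sketch (field name per the Tyk docs; restart the component for the change to take effect):

```shell
# Minimal sketch of a pump config enabling debug logging.
cat > /tmp/pump-debug.conf <<'EOF'
{ "log_level": "debug" }
EOF

# Confirm the fragment parses and the level is set:
jq -r '.log_level' /tmp/pump-debug.conf
```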
A snippet of a pump debug log before being purged to your configured pump
2023-05-26 12:58:32 time="May 26 11:58:32" level=debug msg="Decoded Record: ..."
You should be able to verify the values of the rawRequest and rawResponse fields in the decoded analytics record. The order or sequence of fields in the analytics record can be found here.
An alternative to enabling debug logs is to configure a stdout pump to observe the analytics being purged.
A snippet of an output from the stdout pump
2023-05-26 11:09:36 time="May 26 10:09:36" level=info prefix=stdout-pump tyk-analytics-record="{...}"
2023-05-26 11:09:36 time="May 26 10:09:36" level=info msg="Purged 1 records..." prefix=stdout-pump
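A hypothetical pump.conf fragment adding a stdout pump alongside your existing sinks might look like the following (meta field names follow the Tyk Pump docs; verify against your version):

```shell
# Hypothetical pump.conf fragment registering a stdout pump. The
# log_field_name meta value matches the prefix seen in the log snippet above.
cat > /tmp/pump-stdout.conf <<'EOF'
{
  "pumps": {
    "stdout": {
      "type": "stdout",
      "meta": { "log_field_name": "tyk-analytics-record" }
    }
  }
}
EOF

# Sanity-check the fragment parses and the pump type is stdout:
jq -r '.pumps.stdout.type' /tmp/pump-stdout.conf
```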
6. Analyse the analytics records in Redis before pump purge *
If all else fails, verify that the full analytics record is sent to Redis before being purged by Pump. You may have to check the size of your Redis DB and examine the record itself.
Key | Description
--- | ---
* | Applies only to hybrid cloud edge gateways. Global/pure cloud deployments are exempt from this, as the logs, config and Redis are not available to the user.