Hi All,
I am trying to create a Python script to measure packet loss on our WAN links. The script is ready and already deployed, and it works as expected with the logic given. However, the API output contains anywhere from 0 to 4 'timeSeries' arrays, and I am not sure which 'timeSeries' to pick for packet loss, or which one is accurate. This data will later be shown in a graph and used for long-term analysis.
API - https://api.meraki.com/api/v0/organizations/{{}}/uplinksLossAndLatency?uplink=wan1&ip=8.8.8.8
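For reference, the call is made roughly like this (a minimal sketch, not the exact production code - the env var name and org ID are placeholders, and I'm assuming the v0 API-key header):

import os
import requests

API_KEY = os.environ["MERAKI_API_KEY"]   # placeholder env var name
ORG_ID = "123456"                        # placeholder organization ID

url = f"https://api.meraki.com/api/v0/organizations/{ORG_ID}/uplinksLossAndLatency"
params = {"uplink": "wan1", "ip": "8.8.8.8"}
headers = {"X-Cisco-Meraki-API-Key": API_KEY}

resp = requests.get(url, headers=headers, params=params)
resp.raise_for_status()
data = resp.json()   # a list of entries, each with its own "timeSeries"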
Sample output -
[{"networkId":"",
"serial":"",
"uplink":"wan1",
"ip":"8.8.8.8",
"timeSeries":[{"ts":"2023-01-31T11:38:20Z",
"lossPercent":0.0,
"latencyMs":5.7},
{"ts":"2023-01-31T11:39:20Z",
"lossPercent":0.0,
"latencyMs":5.9},
{"ts":"2023-01-31T11:40:19Z",
"lossPercent":0.0,
"latencyMs":5.8},
{"ts":"2023-01-31T11:41:20Z",
"lossPercent":0.0,
"latencyMs":5.9},
{"ts":"2023-01-31T11:42:20Z",
"lossPercent":0.0,
"latencyMs":6.1}]}