Concatenation of JSON output from pagination

diablo24
Building a reputation

Hi,

 

I'm using pagination to get all the networks and I'm getting 5 pages (4190 networks in total). I'm having trouble putting the output back together into one big JSON document to process. It appears that each page is one JSON array. If I use JsonSlurper to process each page, I end up dealing with one network at a time. My app is Groovy, so I can't use the Python libraries that have a bunch of cool features for dealing with this. I've tried using JsonBuilder and JsonOutput to rebuild the output into a single JSON object, but I've been unsuccessful. If I concatenate all the pages into a String and run it through JsonSlurper, I only ever get 1000 networks. If I pull the data out of each JSON array and then concatenate it into a String, I get errors saying the JSON needs to start with '[' or '{'.
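
To make the failure concrete, the string approach looks roughly like this (sketch only; pages stands in for the list of page strings I get back):

import groovy.json.JsonSlurper

// each page is its own JSON array, e.g. '[{"id":"N_1",...},{"id":"N_2",...}]'
def combined = ''
pages.each { page -> combined += page }

// the combined string looks like '[...][...][...]', which is not one valid JSON
// document, so JsonSlurper either errors out or only gives back the first page
def networks = new JsonSlurper().parseText(combined)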

 

Any help would be greatly appreciated.

-Jerome 

8 REPLIES
mlefebvre
Building a reputation

Can you show your example code please?

 

Edit: on second thought, the endpoint accepts a perPage parameter; why not use that to get everything in one go?

diablo24
Building a reputation

try {
    def v1networkUrl = 'api/v1/organizations/' + orgId + '/networks'
    def networks = processHttpGet(commsHandler, v1networkUrl, metrics)
    if (networks instanceof Collection) {
        //StringBuilder netData = new StringBuilder()
        def netData = ''
        networks.each { network ->
            def jsonOutput = new JsonSlurper().parseText(network)
            jsonOutput.each { json ->
                //netData.append(json)
                netData += json
                numberOfNetworks.incr()
            }
        }
        //def jsonNetData = [netData.toString()]
        //networkData = JsonOutput.prettyPrint(jsonNetData.toString())
        def data = new JsonBuilder().toPrettyString()
        networkData = JsonOutput.toJson(netData)
        configs.addSection(orgId + '_networks', networkData)
    } else {
        configs.addSection(orgId + '_networks', networks)
        numberOfNetworks.incr()
    }
} catch (Exception e) {
    log.error('Error encountered while collecting Networks ' + e.getCause())
    log.debug e.stackTrace
    networkError.incr()
}
diablo24
Building a reputation

I have a lot of code commented out as I've been trying all kinds of ways to make this work.

mlefebvre
Building a reputation

I recalled right after I posted that there is a perPage parameter, and I use that myself for simplicity.

 

 

def getNetworksByOrganizationId(ORG_ID) {
    final MERAKI_API_KEY = "*********************************"
    def cmd = ['bash', '-c', "curl -G -d 'perPage=20000' -L -H 'X-Cisco-Meraki-API-Key: ${MERAKI_API_KEY}' -H 'Content-Type: application/json' -H 'Accept: application/json' --url https://api.meraki.com/api/v1/organizations/${ORG_ID}/networks".toString()]
    def result = cmd.execute().text
    def slurper = new JsonSlurper()
    def json = slurper.parseText(result)
    def nets = new ArrayList()
    for (net in json) {
        nets.add(net)
    }
    return nets.join(',')
}
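
If you can't shell out to curl from your app, the same request should also be doable with plain java.net; a rough, untested sketch (the key and org ID are placeholders, and since the curl above needs -L to follow redirects, check in your environment that the header survives any redirect):

import groovy.json.JsonSlurper

def getNetworks(String orgId, String apiKey) {
    def url = new URL("https://api.meraki.com/api/v1/organizations/${orgId}/networks?perPage=20000")
    def conn = url.openConnection()
    conn.setRequestProperty('X-Cisco-Meraki-API-Key', apiKey)
    conn.setRequestProperty('Accept', 'application/json')
    // parse the response body straight into a List of network maps
    return new JsonSlurper().parse(conn.inputStream)
}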

 

diablo24
Building a reputation

Thanks @mlefebvre. Let me try this.

RaphaelL
Kind of a big deal

This is how I handle my pagination. I don't use the Meraki SDK since I can't install it on my corporate desktop 🙂

 

def getswitchports():
    pages = 0
    results = []
    geturl = '{0}/organizations/{1}/switch/ports/bySwitch'.format(str(base_url_v1), str(orgid))
    dashboard = requests.get(geturl, headers=headers, verify=False)
    if dashboard.status_code == 200:
        raw = dashboard.json()
        for i in raw:
            results.append(i)
        while 'next' in dashboard.links:
            dashboard = requests.get(dashboard.links['next']['url'], headers=headers, verify=False, timeout=25)
            raw = dashboard.json()
            pages = pages + 1
            for i in raw:
                results.append(i)
    print(pages)
    return results
diablo24
Building a reputation

Thanks @RaphaelL. I have that part working. What I was referring to is processing the data after I have all the pages.

RaphaelL
Kind of a big deal

In my snippet, all the pages are appended and returned as the variable 'results'.

 

Works like a charm for our 1900 networks and 4800 switches
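
The same merge should be possible in Groovy along these lines (untested sketch; pages stands in for your list of page strings, one JSON array per page):

import groovy.json.JsonSlurper
import groovy.json.JsonOutput

def slurper = new JsonSlurper()
def allNetworks = []
pages.each { page ->
    // each page parses to a List of network maps; addAll keeps everything in one flat list
    allNetworks.addAll(slurper.parseText(page))
}
// re-serialise the combined list as a single JSON array if you need text again
def networkData = JsonOutput.toJson(allNetworks)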
