JMP Wish List

We want to hear your ideas for improving JMP software.

  1. Search: Please search for an existing idea first before submitting a new idea.
  2. Submit: Post your new idea using the Suggest an Idea button. Please submit one actionable idea per post rather than a single post with multiple ideas.
  3. Kudo & Comment: Kudo ideas you like, and comment to add to an idea.
  4. Subscribe: Follow the status of ideas you like. Refer to status definitions to understand where an idea is in its lifecycle. (You are automatically subscribed to ideas you've submitted or commented on.)

We consider several factors when looking for what ideas to add to JMP. This includes what will have the greatest benefit to our customers based on scope, needs and current resources. Product ideas help us decide what features to work on next. Additionally, we often look to ideas for inspiration on how to add value to developments already in our pipeline or enhancements to new or existing features.


Add a "max threads" argument for new multi http request()

to limit the number of simultaneous HTTP calls.

 

Something like this:

 

custom_multi_http = Function( {mm, max = 30},
	//mm = a Multi HTTP Request object, e.g. mm = multi_http_requests;
	//max = maximum number of simultaneous requests, e.g. max = 5;

	res = {};
	status = {};
	mm_get = mm << Get Requests;
	//split the requests into chunks of at most "max" requests each
	chunks = Ceiling( N Items( mm_get ) / max );
	For( i = 1, i <= chunks, i++,
		chunk_i = New Multi HTTP Request();
		For( j = 1 + (i - 1) * max, j <= Min( max * i, N Items( mm_get ) ), j++,
			chunk_i << Add( mm_get[j] )
		);
		Wait( 0 );
		chunk_i_requests = chunk_i << Get Requests;

		//send this chunk and collect its results and statuses
		Insert Into( res, chunk_i << Send() );
		Insert Into( status, chunk_i_requests << Get Status() );
	);

	//Return takes a single argument, so bundle both lists
	Return( {res, status} );
);
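
For illustration, here is a hypothetical call of the helper sketched above; the URL and the request count are placeholders, borrowed from the example shared later in this thread.

//hypothetical usage of custom_multi_http(); URL and request count are placeholders
mm = New Multi HTTP Request();
For( i = 1, i <= 100, i++,
	mm << Add(
		New HTTP Request( URL( "http://httpbin.org/get" ), Method( "GET" ) )
	)
);
out = custom_multi_http( mm, 5 ); //send at most 5 requests at a time
responses = out[1];
statuses = out[2];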
7 Comments
Status changed to: Acknowledged

Hi @andrewtkarl, thank you for your suggestion! We have captured your request and will take it under consideration.

mia_stephens
Staff
Status changed to: Investigating
 
mia_stephens
Staff

@andrewtkarl, the developer shared this:

 

Currently, a user can write a script using HTTP Request with Send("async"), which returns a JSL promise. They can track the number of outstanding promises, which is the number of concurrent connections.
When a promise has finished (via On Result or On Error), they can queue up more requests.
When a promise has finished (with onResult or onError) they can queue up more requests.

 

Here’s an example, shared by @bryan_boone, of limiting the maximum number of concurrent HTTP requests using HTTP Request and JSL promises:

 

Names Default To Here( 1 );
//populate the requests
requests = {};
promises = {};
count = 100;
max = 30;


For(i = 1, i <= count, i++,
	request = New HTTP Request( //this is the HTTP Request for the Query String Example in the SI
		url( "http://httpbin.org/get" ),
		Method( "GET" ),
		Query String( [["username" => "bob", "address" => "12345", "count"=>i]] )
	);
	InsertInto(requests, request);
);

//now send them async
process_result = Function( {p, r},
	//Show( p );
	//Show( r );
	//a request finished successfully; free a slot in the tracking list
	If( N Items( promises ),
		RemoveFrom( promises, 1 );
	);
);

process_error = Function( {p, e, t},
	//Show( p );
	//Show( e );
	//Show( t );
	//a request failed; free a slot in the tracking list
	If( N Items( promises ),
		RemoveFrom( promises, 1 );
	);
);


While( N Items( requests ),
	//number of open slots: keep at most "max" requests in flight at once
	count = max - N Items( promises );
	If( count > N Items( requests ),
		count = N Items( requests );
	);
	Show( count );
	//send the next batch of requests asynchronously
	For( i = 1, i <= count, i++,
		request = requests[1];
		RemoveFrom( requests, 1 );
		promise = request << Send( "async" );
		InsertInto( promises, promise );
		promise << On Result( process_result );
		promise << On Error( process_error );
	);
	//let finished promises run their callbacks before queuing more
	If( N Items( promises ),
		Wait( 0 );
	);
);
//catch any stragglers
While( N Items( promises ),
	Wait( 0 );
);

Write("\!nDone!!!\!n");
mia_stephens
Staff

@andrewtkarl, does @bryan_boone's response satisfy this request?

andrewtkarl
Level IV

@mia_stephens Yes it does, thank you both for your help.

mia_stephens
Staff

Thanks for the confirmation @andrewtkarl. Will mark this as delivered.

mia_stephens
Staff
Status changed to: Delivered