Last post Jul 29, 2020 10:43 AM by vsetia
Jul 17, 2020 11:32 PM|hilal_w|LINK
Jul 18, 2020 03:11 PM|bruce (sqlwork.com)|LINK
Web API is not really designed for large datasets. You will need to stream the output and increase the timeouts. You might want to look at gRPC, which is a better approach for this.
Note: if you need to support legacy systems, gRPC may not be an option.
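To illustrate the streaming idea in a framework-neutral way, here is a minimal Python sketch (names are hypothetical, not from any specific Web API framework): instead of buffering the whole result set, a generator yields fixed-size chunks that a streaming response can flush incrementally.

```python
# Hypothetical sketch: stream a large result set in fixed-size chunks
# rather than materializing the whole payload in memory at once.

def stream_rows(rows, chunk_size=1000):
    """Yield rows in chunks so the response can be flushed incrementally."""
    chunk = []
    for row in rows:
        chunk.append(row)
        if len(chunk) == chunk_size:
            yield chunk
            chunk = []
    if chunk:
        yield chunk  # flush the final, possibly partial, chunk

# A framework's streaming response would consume this generator
# one chunk at a time; memory use stays bounded by chunk_size.
chunks = list(stream_rows(range(2500), chunk_size=1000))  # 3 chunks: 1000, 1000, 500
```

The same pattern applies whatever the transport: the server never holds more than one chunk in memory, which is what makes large datasets workable without blowing timeouts or memory limits.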
Jul 22, 2020 05:38 AM|yogyogi|LINK
Large amounts of data from a Web API?
You should design your API to send data page by page. This means the client sends two values with every request:
the page number, and
the number of records per page.
The API's job is then to do custom paging on its end and return only the data the client asked for. This is the approach the APIs of large corporations employ.
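The paging described above can be sketched in a few lines of Python (a simplified in-memory version; a real API would translate the page number into an OFFSET/FETCH or keyset query against the database):

```python
def paginate(records, page_no, page_size):
    """Return the slice of records for a 1-based page number."""
    if page_no < 1 or page_size < 1:
        raise ValueError("page_no and page_size must be positive")
    start = (page_no - 1) * page_size  # zero-based offset of the first record
    return records[start:start + page_size]

# Example: 95 records, 10 per page -> page 10 holds the last 5 records.
last_page = paginate(list(range(95)), page_no=10, page_size=10)
```

A response would typically also include the total record count so the client can compute how many pages exist.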
Jul 28, 2020 11:53 AM|vsetia|LINK
For large amounts of data, it is suggested to implement batch jobs using SSIS or some other ETL tool and store all the resultant data in one place. Then use that data to generate reports. You can run that batch job every 30 minutes, or whatever interval suits your business.
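The consolidation step of such a batch job can be sketched as follows (a hypothetical in-memory merge, standing in for what an ETL tool like SSIS would do against real data stores): records from several sources are merged into one store keyed by id, with later sources winning on conflict.

```python
def consolidate(sources):
    """Merge record lists from several sources into one store keyed by 'id'.

    Later sources overwrite earlier ones (last-write-wins), mimicking an
    incremental ETL refresh where newer extracts supersede older rows.
    """
    store = {}
    for records in sources:
        for rec in records:
            store[rec["id"]] = rec
    return store

# Two extracts with one overlapping id: the second extract's row wins.
merged = consolidate([
    [{"id": 1, "total": 10}, {"id": 2, "total": 20}],
    [{"id": 2, "total": 25}, {"id": 3, "total": 30}],
])
```

The scheduled job would run this merge every 30 minutes and persist the result, so report queries never touch the source systems directly.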
Jul 28, 2020 12:27 PM|hilal_w|LINK
Jul 28, 2020 02:33 PM|vsetia|LINK
You can refer to online tutorials here
You can read everything from development to deployment.
You can watch YouTube tutorials and there are several courses available out there :)
Jul 28, 2020 03:18 PM|vsetia|LINK
You can place your final consolidated data, generated from SSIS/ETL as needed by others, at one central location. This data can be in the form of a CSV available at one endpoint. Does that make things easier for you?
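Serializing the consolidated records as CSV for such an endpoint is straightforward with Python's standard `csv` module (a minimal sketch; field names here are made up for illustration):

```python
import csv
import io

def to_csv(records, fields):
    """Serialize a list of record dicts as CSV text for a download endpoint."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fields)
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

# Example with hypothetical report columns:
text = to_csv(
    [{"id": 1, "total": 10}, {"id": 2, "total": 25}],
    fields=["id", "total"],
)
```

An endpoint would return this text with a `text/csv` content type; clients then download one flat file instead of hammering the API for raw rows.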
Jul 29, 2020 10:43 AM|vsetia|LINK
Is there anything else I can help you with?