Question

Roger Chen -o- on Fri, 06 Mar 2015 08:46:16


Hi All~
  This is the problem I ran into using an Azure Table Storage query: about 250K entities in one query, on which I then call "ToList".

Steps
1. IEnumerable<MyObject> results = table.ExecuteQuery<MyObject>(queryString) ... done in a few microseconds, no matter whether I test locally or remotely.
2. results.ToList() ... this takes several seconds in local testing. But when I do a remote query, my God ... it takes hours.

  Any suggestions? Or is Table Storage not recommended for remote queries when the number of target entities is greater than a particular size?

ps. Remote query: running my code on my notebook, which accesses the Azure Table Storage that I rented from MS.
ps2. I'm curious whether step 1 really executes the query remotely. It's too fast to believe that searching for and transferring the data is done in a few microseconds.
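For reference, a minimal sketch of the two steps above (assuming the classic Microsoft.WindowsAzure.Storage SDK; the connection string, table name, and MyObject entity are placeholders):

```csharp
using System.Collections.Generic;
using System.Linq;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

// Placeholder for the entity type in the question.
public class MyObject : TableEntity { }

class Program
{
    static void Main()
    {
        var account = CloudStorageAccount.Parse("<connection string>");
        var table = account.CreateCloudTableClient().GetTableReference("mytable");

        var query = new TableQuery<MyObject>();                     // no request sent yet
        IEnumerable<MyObject> results = table.ExecuteQuery(query);  // step 1: still lazy
        List<MyObject> list = results.ToList();                     // step 2: HTTP requests happen here
    }
}
```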



Replies

Manu Rekhar on Fri, 06 Mar 2015 17:51:38


Hi,

Thank you for posting in here.

We are checking on this and will get back to you at the earliest.

Regards,
Manu Rekhar

Manu Rekhar on Thu, 12 Mar 2015 19:05:56


Hi Roger,

Sorry for the delay.

In your example, the web request calls against your table storage will execute at step 2, the results.ToList() call.

You can use a Fiddler trace while you step through the code to see which lines of code make calls out to your Azure Table Storage. If your query ends up trying to pull back all 250K entities, then it is likely that you will see slower responses against Azure Table Storage versus your local development storage.
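As a sketch, the lazy enumeration can also be made explicit by paging through the result segments yourself with ExecuteQuerySegmented; each loop iteration corresponds to one HTTP request (the service returns at most 1,000 entities per segment), which is what the Fiddler trace would show:

```csharp
var query = new TableQuery<MyObject>();
var allResults = new List<MyObject>();
TableContinuationToken token = null;
do
{
    // One HTTP round trip per segment; up to 1,000 entities come back each time.
    TableQuerySegment<MyObject> segment = table.ExecuteQuerySegmented(query, token);
    allResults.AddRange(segment.Results);
    token = segment.ContinuationToken;  // null once the last segment is returned
} while (token != null);
```

This makes it easy to see why pulling 250K entities takes hundreds of round trips, each paying the network latency to the remote service.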

In order to fully diagnose your issue and investigate the performance of your query, your use of partition and row keys, etc., it is recommended that you open a full support case with Microsoft Support, where you can get further assistance in investigating the performance of that call.
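One common direction such an investigation takes (a sketch only; the filter values are placeholders): restricting the query by PartitionKey, so the service can target a single partition instead of scanning the whole table:

```csharp
// Filter to one partition (and optionally a RowKey range) instead of a full table scan.
string filter = TableQuery.CombineFilters(
    TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, "somePartition"),
    TableOperators.And,
    TableQuery.GenerateFilterCondition("RowKey", QueryComparisons.GreaterThanOrEqual, "someRowKey"));

var query = new TableQuery<MyObject>().Where(filter);
foreach (MyObject entity in table.ExecuteQuery(query))
{
    // process entity
}
```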

Regards,

Manu Rekhar

Roger Chen -o- on Fri, 20 Mar 2015 03:21:57


Hi Manu~
  Thanks for the reply. I did get some help from local MS. And yes, some coding techniques improved the speed, e.g. losing some weight by changing the returned entities to JSON.
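For anyone landing here later: the "lose some weight" trick likely refers to the JSON payload options in the storage client (a sketch, assuming storage client library 3.0 or later, where JSON is supported):

```csharp
var tableClient = account.CreateCloudTableClient();
// JsonNoMetadata omits per-property type metadata from responses,
// shrinking the payload versus AtomPub or full-metadata JSON.
tableClient.DefaultRequestOptions.PayloadFormat = TablePayloadFormat.JsonNoMetadata;
```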