Out of memory exception when downloading large files from Blob #18

Open · ibruynin opened this issue Jul 22, 2019 · 2 comments
ibruynin commented Jul 22, 2019

Hi

Based on Application Insights logs, the following method throws a System.OutOfMemoryException when downloading files larger than 350 MB on an Azure web app with 3.5 GB of memory (S2 service plan):

[Edit: it's not that crazy... it's 350 MB :)]

Strathweb.AspNetCore.AzureBlobFileProvider.AzureBlobFileInfo.CreateReadStream

I'll attach the full stack trace in a separate file:
fullstacktrace.txt

I'm not sure if this is intended to be used for large files, but it sure would be a great and powerful feature if it could be used like that!

ibruynin (Author) commented:

And this is a snapshot of the heap:

[heap snapshot image]

Most of the memory is held by a MemoryStream instance that appears to be roughly twice the size of the file being downloaded...
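For context, a buffered read along these lines would produce exactly that heap shape. This is an illustrative sketch, not the library's actual code; `CloudBlockBlob` comes from the Microsoft.Azure.Storage.Blob SDK and the class name is made up:

```csharp
using System.IO;
using Microsoft.Azure.Storage.Blob;

public class BufferedBlobRead
{
    private readonly CloudBlockBlob _blob;

    public BufferedBlobRead(CloudBlockBlob blob) => _blob = blob;

    // Buffers the whole blob in memory before returning a readable stream.
    public Stream CreateReadStream()
    {
        var stream = new MemoryStream();

        // Copies the entire blob into the MemoryStream. The stream grows by
        // doubling its internal byte[] as data arrives, so a ~350 MB blob can
        // briefly keep both the old buffer and the newly doubled one alive,
        // which matches the ~2x footprint seen in the heap snapshot.
        _blob.DownloadToStream(stream);

        stream.Seek(0, SeekOrigin.Begin);
        return stream;
    }
}
```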

johnayling commented:
The use of a MemoryStream inside the CreateReadStream method prevents it from being used for large files. Ideally the blob stream would be returned directly via a FileStreamResult, so that it acts as a pipe from storage to the client.
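A minimal sketch of that streaming approach, assuming the Microsoft.Azure.Storage.Blob SDK and an MVC controller; `DownloadController`, the injected `CloudBlobContainer`, and the route are illustrative names, not part of this library:

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.Storage.Blob;

[ApiController]
public class DownloadController : ControllerBase
{
    private readonly CloudBlobContainer _container; // assumed to be registered in DI

    public DownloadController(CloudBlobContainer container) => _container = container;

    [HttpGet("download/{name}")]
    public async Task<IActionResult> Download(string name)
    {
        CloudBlockBlob blob = _container.GetBlockBlobReference(name);

        // OpenReadAsync returns a stream that pulls ranges from blob storage
        // on demand, so data flows from storage to the client without being
        // buffered whole in a MemoryStream.
        var blobStream = await blob.OpenReadAsync();

        // FileStreamResult writes the stream to the response and disposes it
        // when the response completes.
        return new FileStreamResult(blobStream, "application/octet-stream")
        {
            FileDownloadName = name
        };
    }
}
```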
