There may not be an entirely simple answer to this except, perhaps, "no".
But it rather depends on a few things. That is one pretty large CSV file!
Have a look at the Help file and Appendix A, which gives the Technical Specifications that need to be factored in.
I would guess you have more than 10M records, but if you are hitting a 10 GB limit without exceeding 10M records you might be able to "filter" the record selection and so reduce the size of the working table. Selecting fewer records in one pass, or eliminating any fields you do not really need, are both worth considering.
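Not knowing your tool's own filter syntax, here is a rough sketch of that pre-filtering idea in plain Python, done as a stream so the whole file never has to fit in memory. The function and field names are made up for illustration:

```python
import csv

def prefilter_csv(src, dst, keep_fields, predicate):
    """Stream a large CSV, keeping only the columns in keep_fields
    and the rows for which predicate(row) is True, so the working
    table is smaller before the extraction tool ever sees it."""
    with open(src, newline="") as fin, open(dst, "w", newline="") as fout:
        reader = csv.DictReader(fin)
        writer = csv.DictWriter(fout, fieldnames=keep_fields)
        writer.writeheader()
        for row in reader:
            if predicate(row):
                # Copy across only the fields you actually need.
                writer.writerow({f: row[f] for f in keep_fields})
```

For example, `prefilter_csv("big.csv", "small.csv", ["id", "dept"], lambda r: r["dept"] == "sales")` would drop every column but `id` and `dept` and every row outside the sales department.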
If, after that, you really do need all the data, you could look at splitting the extraction, either as small variants of the same basic model or by including the split option as a User Runtime selection field entry. The latter can be especially useful if different departments need their own subsets of the total record set.
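The per-department split can also be done ahead of time, outside the tool. As a hedged sketch (the splitting field `"dept"` and file-naming scheme are assumptions, not anything your tool prescribes), this writes one smaller CSV per distinct value of a chosen field:

```python
import csv

def split_by_field(src, field):
    """Split one large CSV into per-value files (e.g. one per
    department), each with the original header, so every group
    can run its own smaller extraction. Returns the values seen."""
    writers, handles = {}, {}
    with open(src, newline="") as fin:
        reader = csv.DictReader(fin)
        for row in reader:
            key = row[field]
            if key not in writers:
                # Open one output file per distinct field value.
                fh = open(f"{src}.{key}.csv", "w", newline="")
                handles[key] = fh
                w = csv.DictWriter(fh, fieldnames=reader.fieldnames)
                w.writeheader()
                writers[key] = w
            writers[key].writerow(row)
    for fh in handles.values():
        fh.close()
    return sorted(handles)
```

Each resulting file is then comfortably under the limit, provided no single department's subset exceeds it on its own.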
Sorry if this all sounds a bit vague, but at this point making a specific suggestion about the best way forward is tricky without a deeper understanding of the problem you face and the workarounds you would find acceptable.