


Running out of memory with MiKTeX



I am attempting to build a number of figures with TikZ. One keeps failing to build, and the log file indicates that I'm running out of memory.



I've used



initexmf --edit-config-file pdflatex
initexmf --dump=pdflatex


to increase the available memory as far as I think I can. The config file now reads:



pool_size=40000000 
main_memory=50000000
extra_mem_bot=40000000


And yet the log file tells me



43000001 words of memory out of 43000000


What's frustrating is that I have built this figure before on an older computer with less memory, from the same cut-and-paste code, which makes me think I may have broken something that's letting something run away.



Here's the code for the figure:



%%%%%%%%%%%%%%%%%%%%%%
\documentclass{singlecol-new}
%%%%%%%%%%%%%%%%%%%%%%

\usepackage{pgfplots}
\usepgfplotslibrary{groupplots}
\usetikzlibrary{pgfplots.groupplots}
\usetikzlibrary{plotmarks}
\usetikzlibrary{patterns}
\usetikzlibrary{calc}
\usepgfplotslibrary{external}

\usepackage[external]{tcolorbox}
\tcbset{
external/prefix=\jobname-,
external/safety=0mm,
external/input source on error=false,
}

\pgfplotsset{compat = 1.12}

\tcbEXTERNALIZE
\tikzexternalize


%%%%%%%%%%%%%%%%%%%%%%
\begin{document}
%%%%%%%%%%%%%%%%%%%%%%


\begin{figure}[h!]
\centering
\begin{extikzpicture}[runs=2]{fig7}
\begin{axis}[
height=8cm,
width=8cm,
xmin=0,
xmax=10000,
%legend style={draw=none},
legend style={at={(0.9,0.4)}},
xlabel = $\frac{V(A)}{Resource Cost}$ Ratio,
ylabel = P(X),
width=0.75\textwidth,
y tick label style={
/pgf/number format/.cd,
fixed,
fixed zerofill,
precision=1,
/tikz/.cd
},
x tick label style={
/pgf/number format/.cd,
fixed,
fixed zerofill,
precision=0,
/tikz/.cd
},
scaled ticks=false,
]
\addplot+[black, mark=o, line join=round, mark repeat=1000] table[col sep=comma, y=Empirical, x=X]{CDFs.csv};
\addlegendentry{\scriptsize Empirical Data}
\addplot+[black, mark=x, line join=round, mark repeat=1000] table[col sep=comma, y=PT, x=X]{CDFs.csv};
\addlegendentry{\scriptsize Extended Pearson-Tukey}
\addplot+[black, mark=|, line join=round, mark repeat=1000] table[col sep=comma, y=SM, x=X]{CDFs.csv};
\addlegendentry{\scriptsize Extended Swanson-Megill}
\addplot+[black, mark=square, line join=round, mark repeat=1000] table[col sep=comma, y=BM, x=X]{CDFs.csv};
\addlegendentry{\scriptsize Bracket Median}
\end{axis}
\end{extikzpicture}
\caption{Elicited CDFs}
\label{CDFGraph}
\end{figure}

\end{document}


Obviously the data is important here, but I'm not sure how to include it; a possible self-contained stand-in is sketched after the excerpt below. It's 10,002 rows by 5 columns of floating-point numbers driving 4 line graphs, and it's honestly one of the simplest graphs in this paper.



The data file looks like



X Empirical PT BM SM
0 0 0.001 0.001 0.001
1 0 0.001 0.001 0.001
2 0 0.001 0.001 0.001
3 0 0.001 0.001 0.001
4 0 0.001 0.001 0.001
5 0 0.001 0.001 0.001
6 0 0.001 0.001 0.001
7 0 0.001 0.001 0.001
8 0 0.001 0.001 0.001
9 0 0.00101 0.001 0.001
10 0 0.00101 0.001 0.001
11 0 0.00101 0.001 0.001
12 0 0.00101 0.00101 0.00101

....
9990 1 1 1 1
9991 1 1 1 1
9992 1 1 1 1
9993 1 1 1 1
9994 1 1 1 1
9995 1 1 1 1
9996 1 1 1 1
9997 1 1 1 1
9998 1 1 1 1
9999 1 1 1 1
10000 1 1 1 1
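
Since the full CDFs.csv is too large to paste, here is a minimal, self-contained stand-in that could be compiled instead. The logistic expression, the samples value, and the article class are purely illustrative (none of them come from the real data); samples would have to be raised toward the original 10,000 rows, at the cost of compile time, to approach the memory load of the real file.

% Hypothetical stand-in for one of the CDFs.csv series: a densely
% sampled logistic curve, meant only to mimic the size of the data.
\documentclass{article}
\usepackage{pgfplots}
\pgfplotsset{compat=1.12}
\begin{document}
\begin{tikzpicture}
\begin{axis}[height=8cm, width=8cm, xmin=0, xmax=10000, scaled ticks=false]
\addplot+[black, mark=o, line join=round, mark repeat=1000,
          domain=0:10000, samples=2501] {1/(1+exp(-(x-5000)/800))};
\end{axis}
\end{tikzpicture}
\end{document}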









miktex tikz-external memory

asked Sep 7 '18 at 18:17 by jerH

  • One sure way to fix the problem is to do all your tikzpictures using standalone or externalize. If you run out of memory for a single graph, you are using too many data points.

    – John Kormylo
    Sep 8 '18 at 15:37











  • @JohnKormylo I am running them externalized, and though there are a lot of points (10,000 per series) I have built this graph before...just not sure why it's running out of memory now. Other charts that in the past gave me memory issues are working fine, but this one never caused a problem before...

    – jerH
    Sep 10 '18 at 1:28











  • The data file, cdfs.csv, is available at pastebin.com/u/jerH

    – jerH
    Sep 10 '18 at 2:00















2 Answers
Interestingly, I uninstalled and then re-installed the pgf package and this issue disappeared. Not sure if that counts as an "answer"...






answered Sep 19 '18 at 20:59 by jerH

  • No, I tried uninstalling/reinstalling both pgf and pgfplots and it did not make a jot of difference. The only way it works for me is to use 25% of the .csv, which gives 2008247 words of memory out of 3000000 (the default), so it could perhaps run say 4500 points max? Update: I tried that, and 4500 points gives 2938707 words of memory out of 3000000, so it is OK up to there, but 5000 failed. I'm not sure whether it would be simpler to just reduce the data to every second entry and bump the memory a little.

    – KJO
    Oct 20 '18 at 1:03
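
    A rough sketch of that thinning idea, using pgfplots' each nth point filter so the CSV itself does not have to be edited; the filter value of 2 and the halved mark repeat are just examples, applied here to the first of the question's plots:

    % each nth point=2 keeps every 2nd row, which should roughly halve what
    % pgfplots has to store; mark repeat is halved so about as many marks
    % remain visible. filter discard warning=false suppresses the warnings
    % about the discarded points.
    \addplot+[black, mark=o, line join=round, mark repeat=500,
              each nth point=2, filter discard warning=false]
        table[col sep=comma, y=Empirical, x=X]{CDFs.csv};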

































Having a need to bump the memory in MiKTeX, I searched around and collected the following recommendations, which allowed me to revisit and run this plot.



 > initexmf --edit-config-file=pdfLaTeX


In Notepad, change the following lines, or add them if there is no prior value:



main_memory=12000000
extra_mem_bot=99999999
font_mem_size=3000000


Save the file and, back at the prompt, run



> initexmf --dump=pdflatex


If you get an error message, repeat the process with reduced values until --dump=pdflatex no longer errors. For speed, use a binary chop, halving the last difference each time (for example, if 12000000 fails and 3000000 works, try 7500000 next).



Previously, the points required:

2008247 words of memory out of  3000000 (default) for 25% of the data

2938707 words of memory out of  3000000 (default) for 45% of the data

However, the final working log shows, for the full 10000 points (100%):

  793011 words of memory out of 12000000

which is roughly 10% of what I expected (I guess the memory gets partially cleared during the run).






answered Nov 13 '18 at 2:23 by KJO (edited Nov 13 '18 at 2:30)