macOS error: vector memory exhausted (limit reached?)

I am trying to run an R script (specifically, I am using the getLineages function from the Bioconductor package slingshot).

I would like to know why the error "vector memory exhausted (limit reached?)" appears when I use this function, since it does not seem to be the most memory-intensive function in the package, relative to the others I have run on the same data.

I know there are other similar questions on Stack Overflow, but they all suggest switching to the 64-bit version of R, and I am already using it. This problem seems to have no other answer so far, and I was wondering whether anyone might know one.

The data is only about 120 MB in size, which is far less than my computer's 8 GB of RAM.
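For context, the failing call looks roughly like this (a minimal sketch; rd and cl are placeholder names for my reduced-dimension matrix and cluster labels):

library(slingshot)

# rd: matrix of reduced-dimension coordinates; cl: vector of cluster labels
lineages <- getLineages(rd, cl)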

(Screenshot: R 64-bit version)


For those using RStudio: I've found that setting Sys.setenv('R_MAX_VSIZE' = 32000000000), as suggested in multiple Stack Overflow posts, only works on the command line, and that setting that parameter while using RStudio does not prevent this error:

Error: vector memory exhausted (limit reached?)
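For reference, the in-session call described above (which reportedly works only when R is started from a terminal, not from RStudio) would look like this; the value is in bytes, so 32000000000 is roughly 32 GB:

# attempt to raise the vector heap limit for this session (~32 GB);
# per the above, this does not take effect inside RStudio
Sys.setenv('R_MAX_VSIZE' = 32000000000)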

After doing some more reading, I found this thread, which clarifies the problem with RStudio and identifies a solution, shown below:

Step 1: Open a terminal.

Step 2:

cd ~
touch .Renviron
open .Renviron

Step 3: Save the following as the first line of .Renviron:

R_MAX_VSIZE=100Gb

Step 4: Close and reopen RStudio.
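To confirm the new limit was picked up after restarting, you can check the environment variable from within R (Sys.getenv is base R):

# should print "100Gb" if .Renviron was read at startup
Sys.getenv("R_MAX_VSIZE")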

Note: this limit includes both physical and virtual memory, so setting R_MAX_VSIZE=16Gb on a machine with 16 GB of physical memory may not prevent this error. You may have to experiment with this parameter, depending on the specs of your machine.

I had the same problem; increasing R_MAX_VSIZE did not help in my case. Instead, removing variables that were no longer needed solved the problem. Hope this helps those who are struggling here.

# remove large objects that are no longer needed
rm(large_df, large_list, large_vector, temp_variables)
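After removing objects, you can also trigger a garbage collection so the freed memory is actually reclaimed; gc() and ls() are base R:

# force a garbage collection and print a memory-usage summary
gc()

# list the objects still in the workspace, to verify the cleanup
ls()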

Editing .Renviron can be done through RStudio as well.

library(usethis)
usethis::edit_r_environ()  # opens ~/.Renviron in an editor tab

When the tab opens in RStudio, add this as the first line: R_MAX_VSIZE=100Gb (or whatever amount of memory you wish to allow).

Restart R (and/or restart your computer) and rerun the R command that gave you the memory error.

I had this problem when running Rcpp::sourceCpp("my_cpp_file.cpp"), resulting in

Error: vector memory exhausted (limit reached?)

Changing the user-level Makevars file (~/.R/Makevars) solved it for me. Currently mine looks like this:

# ~/.R/Makevars: compiler settings R uses when compiling C/C++ code
CC=gcc
CXX=g++
CXX11=g++
CXX14=g++
CXX17=g++
CXX1X=g++
LDFLAGS=-L/usr/lib
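If you'd rather not create the file by hand, usethis also provides a helper for the user-level Makevars, analogous to edit_r_environ() above (a sketch, assuming usethis is installed):

library(usethis)

# opens ~/.R/Makevars for editing, creating the file if needed
usethis::edit_r_makevars()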