
Memory Issue #23

@ghost

Description

I am running into a memory issue with drandomForest. I receive this familiar message: Error in drandomForest.default(m, y, ..., ntree = ntree, nExecutor = nExecutor, :
cannot allocate vector of size 7.9 Gb

Function call:
tree.result <- drandomForest(Target ~ ., data = as.data.frame(signalDF), mtry = predictors, nExecutor = 4)

Executing on a 64-bit machine with 32 GB of memory. There are 64 predictors, and the dataset contains ~4 million rows; the data is slightly over 2 GB in size. We are using the function in classification mode (y is set as a factor). The predictors are a mix of numeric variables and factors.

Please advise. Thanks!
