10x your ML models using AWS databases and an EC2 instance.
I only have 4 cores on my laptop, which is severely restrictive when running an ML or stochastic model.
To speed things up, I upload my data to an AWS database and train my models in R on an AWS cloud computing (EC2) instance. I find these are complementary workflows that keep my data securely hosted and my models executing quickly.
While this workflow costs a few dollars, it is well worth it to me, since most models execute in a minute or less.
First, write your data to the database with RMariaDB:
library(RMariaDB)

# Example data: any data frame works; mtcars is used here for illustration
mydata <- as.data.frame(mtcars)

# Connect to the RDS instance; copy the "Endpoint" value from the database page in the AWS console
con <- dbConnect(MariaDB(),
                 user = 'username',
                 password = 'password',
                 host = 'mydbinstance.xxx.region.rds.amazonaws.com',
                 dbname = 'dbname')

# Upload the data frame to the database as a table named 'carsdata'
dbWriteTable(conn = con, name = 'carsdata', value = mydata, overwrite = TRUE)
You may also want to install [MySQL Workbench](https://dev.mysql.com/doc/refman/8.0/en/) for a user-friendly SQL GUI. You can use MySQL Workbench to create new databases and keep them organized.
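If you prefer to stay in R, the same connection can create and list databases directly. A minimal sketch, assuming your user has the necessary privileges (the database name below is a placeholder):

# Create a new database and confirm it exists (hypothetical name 'my_new_db')
dbExecute(con, "CREATE DATABASE IF NOT EXISTS my_new_db")
dbGetQuery(con, "SHOW DATABASES")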
That’s the end of part 1.
In part 2, working in R on the EC2 instance, connect to the same database (as above) and read the table back in:
mydata <- dbReadTable(conn = con, name = 'carsdata')
Then, run your models on the instance's 64 cores.
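A minimal sketch of what that parallel step could look like, assuming a Linux instance; the model, resampling scheme, and fit count here are illustrative, not part of the original workflow:

library(parallel)

# Detect available cores (e.g., 64 on a large EC2 instance)
n_cores <- detectCores()

# Fit one model per bootstrap resample, spread across all cores
# (mclapply forks the R process, so it runs in parallel on Linux/macOS)
fits <- mclapply(1:1000, function(i) {
  boot <- mydata[sample(nrow(mydata), replace = TRUE), ]
  lm(mpg ~ wt + hp, data = boot)
}, mc.cores = n_cores)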
Ultimately, you may want to download your trained models from the instance for further analysis and report creation.
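One way to do that, with placeholder file, key, and host names: serialize the fitted models with saveRDS() on the instance, copy the file to your laptop, and reload it with readRDS():

# On the EC2 instance: serialize the fitted models to disk
saveRDS(fits, file = 'fits.rds')

# From a shell on your laptop, copy the file down (placeholder key and host):
# scp -i mykey.pem ec2-user@ec2-xx-xx-xx-xx.compute.amazonaws.com:~/fits.rds .

# Back on the laptop: reload the models for analysis or reporting
fits <- readRDS('fits.rds')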