Deferred & Remote Function Execution in R

What would you say if you could automatically wrap your R pipeline – which consists of numerous functions and variables – into a single function? What would you say if you could do it repeatedly, with no extra effort, regardless of its complexity? Would you store it to document your progress at that particular moment in…
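For flavor, here is a hand-rolled sketch of the general idea – not this post's actual mechanism, just an illustration of how a closure can capture a pipeline's functions and variables behind a single callable:

```r
# Hypothetical illustration (not the post's API): bake a pipeline's helper
# functions and variables into one closure.
clean <- function(x) x[!is.na(x)]
score <- function(x, w) mean(x) * w

make_pipeline <- function(w = 2) {
  # `clean`, `score` and `w` are all captured by the returned function
  function(x) score(clean(x), w)
}

run <- make_pipeline()
run(c(1, NA, 3))  # 4
```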

Inspecting R in GDB (with Python)

Today I spent a few hours debugging a hanging R process that left behind a zombie sh, which so far suggests a bug (a race condition?) in R’s system2() call. Anyway, it soon turned out that the only way to see what’s happening inside R is to use gdb, which I personally dread. That’s because I haven’t found…
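For context, the shape of the setup looked roughly like this (the exact command is simplified and hypothetical); the PID printed by Sys.getpid() is what you hand to gdb from another terminal:

```r
# Simplified reproduction of the situation: a system2() call that spawns a
# shell and hangs, leaving a zombie `sh` behind.
Sys.getpid()                                           # PID of this R session
system2("sh", c("-c", "some-long-running-command"),    # hypothetical command
        stdout = TRUE)
# from another terminal, attach to the stuck process:
#   gdb -p <pid>
```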

(A Very) Experimental Threading in R

I’ve been trying to find a way to introduce threads to R. There can be many reasons to do that, among them simplified input/output logic, sending tasks to the background (e.g. building a model asynchronously), and running computation-intensive tasks in parallel (e.g. a parallel, chunk-wise var() on a large vector). Finally, it’s…
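As a taste of that last use case, here is a sketch using the existing parallel package rather than threads: each chunk reports partial sums, and the pieces are combined with the standard variance identity.

```r
library(parallel)

# Chunk-wise var() on a large vector: workers return partial sums, combined
# via var(x) = (sum(x^2) - sum(x)^2 / n) / (n - 1).
chunked_var <- function(x, chunks = 4L) {
  idx <- split(seq_along(x), cut(seq_along(x), chunks, labels = FALSE))
  parts <- mclapply(idx, function(i) {     # forks; set mc.cores = 1 on Windows
    xi <- x[i]
    c(n = length(xi), s = sum(xi), s2 = sum(xi^2))
  })
  tot <- Reduce(`+`, parts)
  unname((tot["s2"] - tot["s"]^2 / tot["n"]) / (tot["n"] - 1))
}

x <- rnorm(1e6)
all.equal(chunked_var(x), var(x))  # TRUE
```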

A new `subprocess` package for R

Here’s a new package that brings to R an API for handling child processes, similar to the one Python offers. Unlike the system() and system2() calls already available in the base package, or the mclapply() function from the parallel package, this new API is aimed at long-lived child processes that can be controlled by the parent R process in a…
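A minimal sketch of the kind of session this enables – the call names follow the package, though exact signatures may differ slightly:

```r
library(subprocess)

# Spawn a long-lived shell, talk to it over its pipes, then shut it down.
handle <- spawn_process("/bin/sh")
process_write(handle, "echo hello from the child\n")
process_read(handle, pipe = PIPE_STDOUT, timeout = 1000)  # poll stdout, up to 1s
process_kill(handle)
```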

Subprocess in R?

I’ve been trying to find an R equivalent of Python’s subprocess module, and so far I’ve failed. Since the ability to handle child processes in a way more sophisticated than a simple system() call (R actually has two of them, system() and system2()) might come in handy, I decided to build a new package for R. Its name is…
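To see the gap, contrast the two base calls: each either blocks until the child exits or fires and forgets, and neither returns a handle you can later read from, write to, or signal.

```r
# Base R's options, roughly (illustrative only):
out <- system2("ls", "-l", stdout = TRUE)  # blocks, then returns captured output
system("sleep 60", wait = FALSE)           # detaches; no handle to control it later
```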