Thanks for the kind words, Kenny!
We found Go to be an excellent choice for this type of tool, as we could make very good use of the built-in concurrency primitives, and also, of course, benefit from the language's general robustness, performance and tooling.
We have used SciPipe in our latest study at pharmb.io, on building machine learning models to predict hazards in drug molecules (see dx.doi.org/10.3389/fphar.2018.01256). I have since moved into industry for the time being, but plan on using SciPipe in some research projects of my own, and my colleagues at pharmbio are also interested in applying it to some upcoming projects. Hoping to see more people playing with / adopting it. :)
Btw, I should also mention that although SciPipe was created out of needs in bioinformatics and cheminformatics, I have found it to be a handy tool for very general data-processing tasks as well, especially since it makes it easy to mix and match various GNU *nix commandline utilities with components written in Go. One example of this is a small pipeline we wrote to download, unzip and parse a somewhat large XML dataset, using commandline tools combined with Go code for the actual XML parsing (see the code example at the bottom of this post: bionics.it/posts/parsing-drugbank-xml-or-any-large-xml-file-in-streaming-mode-in-go ).
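To give a flavour of what that mixing looks like, here is a rough sketch of such a pipeline: two processes that are just shell one-liners (curl and zcat), and a third whose body is plain Go code. The URL and file names are made up, and the API calls are written from memory, so please check the docs at scipipe.org for the exact current syntax:

    package main

    import (
    	"encoding/xml"
    	"fmt"
    	"io"
    	"os"

    	sp "github.com/scipipe/scipipe"
    )

    func main() {
    	wf := sp.NewWorkflow("xml_pipeline", 4)

    	// Download the dataset with plain curl (placeholder URL)
    	download := wf.NewProc("download", "curl -sL -o {o:gz} https://example.com/dataset.xml.gz")
    	download.SetOut("gz", "dataset.xml.gz")

    	// Unzip it with zcat, another shell one-liner
    	unzip := wf.NewProc("unzip", "zcat {i:gz} > {o:xml}")
    	unzip.SetOut("xml", "{i:gz|%.gz}") // same path as the input, minus the .gz suffix
    	unzip.In("gz").From(download.Out("gz"))

    	// Parse the XML in Go rather than shelling out. The command string here
    	// only declares the ports; the actual work happens in CustomExecute.
    	parse := wf.NewProc("parse", "# Custom Go code: {i:xml} -> {o:summary}")
    	parse.SetOut("summary", "{i:xml}.summary.txt")
    	parse.In("xml").From(unzip.Out("xml"))
    	parse.CustomExecute = func(t *sp.Task) {
    		f, err := os.Open(t.InPath("xml"))
    		sp.Check(err)
    		defer f.Close()

    		// Stream-decode with encoding/xml; counting start elements here just
    		// stands in for whatever real parsing logic the pipeline needs.
    		dec := xml.NewDecoder(f)
    		count := 0
    		for {
    			tok, err := dec.Token()
    			if tok == nil || err == io.EOF {
    				break
    			}
    			sp.Check(err)
    			if _, ok := tok.(xml.StartElement); ok {
    				count++
    			}
    		}
    		t.OutIP("summary").Write([]byte(fmt.Sprintf("element count: %d\n", count)))
    	}

    	wf.Run()
    }

The nice part is that the shell steps and the Go step are connected the same way, so you can start with quick commandline hacks and swap individual steps over to Go code as they grow more involved.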