Investor – Portfolio Optimization

Portfolio optimization is the process of selecting the best portfolio (asset distribution), out of the set of all portfolios being considered, according to some objective. The objective typically maximizes factors such as expected return and minimizes costs such as financial risk. In this blog we demonstrate how the Investor API can be used to determine optimized ‘Strategy Portfolios’. We use the RandomDistributor to generate randomly distributed portfolios and the KPIValues to determine the best combination. Read more…
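
The idea can be illustrated in a few lines of plain Scala: generate random weight vectors, score each with a KPI, and keep the best. The Sharpe-style score and helper names below are hypothetical stand-ins for the Investor API's RandomDistributor and KPIValues, so treat this purely as a minimal sketch of the random-search approach.

```scala
import scala.util.Random

object RandomPortfolioSketch {
  // Hypothetical stand-in for RandomDistributor: random weights that sum to 1
  def randomWeights(n: Int, rnd: Random): Vector[Double] = {
    val raw = Vector.fill(n)(rnd.nextDouble())
    val sum = raw.sum
    raw.map(_ / sum)
  }

  // Hypothetical KPI (Sharpe-like ratio) based on assumed per-asset returns and risks
  def kpi(weights: Vector[Double], returns: Vector[Double], risks: Vector[Double]): Double = {
    val expReturn = weights.zip(returns).map { case (w, r) => w * r }.sum
    val risk      = weights.zip(risks).map { case (w, s) => w * s }.sum
    expReturn / risk
  }

  def main(args: Array[String]): Unit = {
    val rnd     = new Random(42)
    val returns = Vector(0.06, 0.08, 0.03)   // assumed expected returns
    val risks   = Vector(0.10, 0.20, 0.05)   // assumed risk figures
    // Generate many random portfolios and keep the one with the best KPI
    val best = (1 to 10000)
      .map(_ => randomWeights(returns.size, rnd))
      .maxBy(w => kpi(w, returns, risks))
    println(s"best weights: $best, KPI: ${kpi(best, returns, risks)}")
  }
}
```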

H2O Sparkling Water – Distributed Random Forest

H2O is a machine learning framework implemented in Java that provides APIs for Scala, Python and R. This framework has the following distinctive features: it has an interactive web GUI (Flow), so you can work without any programming, and the trained models can be deployed easily in a production JVM environment with minimal dependencies. In this document we demonstrate how H2O can be used with Scala & Spark. Read more…
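
For context, here is a rough sketch of what training a Distributed Random Forest with Sparkling Water from Scala can look like. It assumes a SparkSession named `spark` and an iris DataFrame with four numeric feature columns and a categorical "species" label; exact imports and conversion helpers differ between Sparkling Water/H2O versions, so take this as an outline rather than copy-paste code.

```scala
import org.apache.spark.h2o.H2OContext
import _root_.hex.tree.drf.DRF
import _root_.hex.tree.drf.DRFModel.DRFParameters

// Assumes an existing SparkSession `spark` and a DataFrame `irisDF`
// with four numeric feature columns and a categorical "species" label.
val h2oContext = H2OContext.getOrCreate(spark)   // start H2O on top of Spark
val trainFrame = h2oContext.asH2OFrame(irisDF)   // Spark DataFrame -> H2OFrame

val params = new DRFParameters()
params._train           = trainFrame._key        // H2O frame to train on
params._response_column = "species"              // column to predict
params._ntrees          = 50                     // number of trees in the forest

val model = new DRF(params).trainModel().get()   // launch the DRF job and wait
println(model._output)                           // trained model summary
```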

Naive Bayes with Weka

If you are interested in machine learning on the JVM you should not forget about good old Weka. It was primarily designed to be used through its Swing GUI, but it can also be used as an API. In terms of documentation I can recommend this manual and the javadoc. In my demo I use the NaiveBayesMultinomial classifier with the iris dataset, which is loaded directly from the Internet. The Jupyter notebook can Read more…
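
As a taste of the API, the following Scala snippet shows the general Weka workflow – load a dataset, build a classifier, cross-validate. The iris ARFF URL is just a placeholder; the notebook linked in the post contains the actual details.

```scala
import weka.classifiers.Evaluation
import weka.classifiers.bayes.NaiveBayesMultinomial
import weka.core.converters.ConverterUtils.DataSource

// Placeholder URL – any reachable ARFF copy of the iris dataset will do
val source = new DataSource("https://example.org/iris.arff")
val data   = source.getDataSet
data.setClassIndex(data.numAttributes - 1)        // last attribute is the class

val classifier = new NaiveBayesMultinomial()
classifier.buildClassifier(data)                  // train on the full dataset

// 10-fold cross-validation for an unbiased performance estimate
val eval = new Evaluation(data)
eval.crossValidateModel(new NaiveBayesMultinomial(), data, 10, new java.util.Random(1))
println(eval.toSummaryString)
```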

Exchanging Data between MLlib, DL4J and Shogun

Each machine learning framework has its own basic data model, and converting the existing data to the required input format of the specific framework is part of the ‘data preparation’. Most of these frameworks also provide direct access to some predefined datasets (e.g. iris, MNIST, etc.). I tried to give a quick overview on how to exchange data between the following frameworks – DeepLearning for Java (DL4J), Scala MLlib and Shogun – so that Read more…
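
To give a flavour of what such conversions look like, here is a minimal Scala sketch going from a plain Array[Double] to an MLlib vector and an ND4J array (DL4J's data structure) and back; the corresponding Shogun conversion depends on the version of its JVM bindings and is left out here.

```scala
import org.apache.spark.ml.linalg.{Vector, Vectors}
import org.nd4j.linalg.api.ndarray.INDArray
import org.nd4j.linalg.factory.Nd4j

// One iris sample as a plain Scala array – the common denominator
val raw: Array[Double] = Array(5.1, 3.5, 1.4, 0.2)

// MLlib representation: a dense vector
val mllibVec: Vector = Vectors.dense(raw)

// DL4J / ND4J representation: an INDArray (row vector)
val nd4jArr: INDArray = Nd4j.create(raw)

// And back again – both sides expose the data as double arrays
val fromMllib: Array[Double] = mllibVec.toArray
val fromNd4j:  Array[Double] = nd4jArr.toDoubleVector

println(fromMllib.sameElements(fromNd4j))  // true: same underlying values
```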

Random Forest Classifier in Spark ML

MLlib is Spark’s machine learning (ML) library. Its goal is to make practical machine learning scalable and easy. I tried to make a complete step-by-step classification example using the Iris flower data set and the BeakerX Jupyter kernel, covering the following steps: Setup, Data Preparation, Testing and Prediction, and Validation. The example is written in Scala, but you could use any other language supported by the JVM. My example can be found in this GIST. Read more…
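
The core of such an example boils down to a small Spark ML pipeline like the sketch below. It assumes an iris DataFrame `irisDF` with the usual four feature columns and a string label (the column names here are assumptions); the full notebook is in the GIST.

```scala
import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.classification.RandomForestClassifier
import org.apache.spark.ml.evaluation.MulticlassClassificationEvaluator
import org.apache.spark.ml.feature.{StringIndexer, VectorAssembler}

// Hold out 20% of the data for testing
val Array(train, test) = irisDF.randomSplit(Array(0.8, 0.2), seed = 42)

val indexer = new StringIndexer()                 // string label -> numeric index
  .setInputCol("species").setOutputCol("label")
val assembler = new VectorAssembler()             // feature columns -> single vector
  .setInputCols(Array("sepal_length", "sepal_width", "petal_length", "petal_width"))
  .setOutputCol("features")
val rf = new RandomForestClassifier()
  .setLabelCol("label").setFeaturesCol("features").setNumTrees(20)

val model = new Pipeline().setStages(Array(indexer, assembler, rf)).fit(train)

val predictions = model.transform(test)
val accuracy = new MulticlassClassificationEvaluator()
  .setLabelCol("label").setMetricName("accuracy").evaluate(predictions)
println(s"Test accuracy: $accuracy")
```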

Using Shogun in the JVM

Shogun is an open-source machine learning library that offers a wide range of efficient and unified machine learning methods. It is implemented in C++ and provides the necessary Java integration so that it can be used in any language based on the JVM: Java, Scala, Groovy, Kotlin, etc. I tried to make Shogun easier to use in a JVM environment. The documentation on how to use Shogun e.g. in Read more…
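
For orientation, the usage pattern of the older java_modular bindings looks roughly like the Scala snippet below: load the native library, initialise Shogun, and wrap your data in its feature classes. Class and method names vary considerably between Shogun versions, so take this purely as an illustration of the kind of glue code involved.

```scala
import org.jblas.DoubleMatrix
import org.shogun.{RealFeatures, modshogun}

object ShogunHello {
  def main(args: Array[String]): Unit = {
    // The SWIG-generated bindings are backed by a native library
    System.loadLibrary("modshogun")
    modshogun.init_shogun_with_defaults()

    // Wrap a small feature matrix (columns = samples) in Shogun's RealFeatures
    val data  = new DoubleMatrix(Array(Array(1.0, 2.0, 3.0), Array(4.0, 5.0, 6.0)))
    val feats = new RealFeatures(data)
    println(s"number of feature vectors: ${feats.get_num_vectors()}")

    modshogun.exit_shogun()
  }
}
```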

Deeplearning4j – Recurrent Neural Networks (RNN)

A recurrent neural network (RNN) is a class of artificial neural network where connections between nodes form a directed graph along a sequence. This allows it to exhibit temporal dynamic behavior for a time sequence. Unlike feedforward neural networks, RNNs can use their internal state (memory) to process sequences of inputs. I am showing a basic implementation of an RNN in DL4J. Further information can be found at https://deeplearning4j.org/docs/latest/deeplearning4j-nn-recurrent. This demo has been implemented in Read more…
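
To give an idea of what the DL4J side looks like, here is a minimal network configuration with one LSTM layer and an RNN output layer; the layer sizes are made up for this sketch, and the full demo is in the linked post.

```scala
import org.deeplearning4j.nn.conf.NeuralNetConfiguration
import org.deeplearning4j.nn.conf.layers.{LSTM, RnnOutputLayer}
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork
import org.nd4j.linalg.activations.Activation
import org.nd4j.linalg.lossfunctions.LossFunctions

// Made-up sizes: 10 input features per time step, 32 hidden units, 3 output classes
val nIn = 10; val nHidden = 32; val nOut = 3

val conf = new NeuralNetConfiguration.Builder()
  .seed(42)
  .list()
  .layer(0, new LSTM.Builder()                    // recurrent layer keeps internal state
    .nIn(nIn).nOut(nHidden)
    .activation(Activation.TANH).build())
  .layer(1, new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
    .nIn(nHidden).nOut(nOut)                      // one prediction per class, per time step
    .activation(Activation.SOFTMAX).build())
  .build()

val net = new MultiLayerNetwork(conf)
net.init()
println(net.summary())
```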

Deeplearning4j – Iris Classification Example

I am a big fan of Keras – fortunately a similar framework is also available when we need to implement a solution that runs on the JVM: DeepLearning4J. In this blog I give a quick introduction to some of the key concepts needed to get started. I use the Iris dataset to demonstrate the classification of data with the help of a neural network. This demo has been implemented in Scala Read more…
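
The essence of the example is a small feed-forward network; the sketch below (with assumed hyperparameters) uses DL4J's built-in IrisDataSetIterator and a single hidden layer.

```scala
import org.deeplearning4j.datasets.iterator.impl.IrisDataSetIterator
import org.deeplearning4j.nn.conf.NeuralNetConfiguration
import org.deeplearning4j.nn.conf.layers.{DenseLayer, OutputLayer}
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork
import org.nd4j.linalg.activations.Activation
import org.nd4j.linalg.lossfunctions.LossFunctions

// 4 features (sepal/petal measurements), 3 classes, assumed hidden size of 10
val conf = new NeuralNetConfiguration.Builder()
  .seed(42)
  .list()
  .layer(0, new DenseLayer.Builder().nIn(4).nOut(10)
    .activation(Activation.RELU).build())
  .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
    .nIn(10).nOut(3).activation(Activation.SOFTMAX).build())
  .build()

val net = new MultiLayerNetwork(conf)
net.init()

// Built-in iterator: 150 samples, batch size 150 (the whole dataset at once)
val irisIter = new IrisDataSetIterator(150, 150)
for (_ <- 1 to 100) {      // a few epochs are enough for this toy problem
  net.fit(irisIter)
  irisIter.reset()
}
```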

Investor – Automatic Trading with E-Trade

Hurray – today I finally managed to finish the implementation of the E-Trade integration. Unfortunately it turned out to be a little more complicated than initially thought, and there were quite a few stumbling blocks around the topic of “authentication”. I have already demonstrated how to do trading with actual data. In this document I want to show how to do automatic trading with E-Trade. Information on how to request access to E-Trade Read more…